Beginners in the web hosting business are often very confident about handling the complete system on their own. Reality sets in when they dig deeper and encounter many paradigms they weren’t ready for. New business owners initially assume that hiring a fully managed web hosting service provider will cost a lot and won’t fit their budget. Later, many of them become an easy target for cybercrime, and only then do they realize the importance of hiring a fully managed hosting service provider.
Normally, your hosting account’s PHP configuration file exists at /conf/php.ini, and its settings apply globally to your account. But if you need to change the default settings for one of your subdomains or subdirectories, follow these steps:
- Log in to your account via SSH
- Copy the complete file /conf/php.ini to the subdomain / subdirectory folder in which you would like the custom settings applied.
- Edit the new php.ini with the custom settings you want applied to the subdomain / subdirectory
- Inside the same folder where you copied the php.ini, create an .htaccess file which contains the below line:
This instructs Apache to make that environment variable available to all scripts in that directory, and PHP will check it to determine where to search for php.ini.
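The environment variable PHP checks for this purpose is PHPRC. A minimal sketch of the .htaccess line in question (the path below is a placeholder — substitute your own subdomain / subdirectory path):

```apache
# Point PHP at the folder containing the copied php.ini
# (the path shown is hypothetical; use your actual directory)
SetEnv PHPRC /home/username/public_html/subdomain
```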
Note: the following step is very important.
Certainly, you will not want the outside world to have access to the settings in your php.ini file. So what is the solution? Add the following to the .htaccess file in the same directory:
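A standard way to do this uses Apache 2.2-style access-control directives (a sketch; on Apache 2.4 the equivalent inside the block would be `Require all denied`):

```apache
# Deny all web access to the copied php.ini
<Files php.ini>
    Order allow,deny
    Deny from all
</Files>
```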
Please note the following versions of OpenSSL:
- OpenSSL 1.0.1 through 1.0.1f (inclusive) are vulnerable
- OpenSSL 1.0.1g is NOT vulnerable
- OpenSSL 1.0.0 is NOT vulnerable
- OpenSSL 0.9.8 is NOT vulnerable
To check whether your server is vulnerable, on CentOS / Red Hat, run:
rpm -qa openssl*
yum info openssl | egrep "Package|Version|Release"
On Ubuntu Server:
dpkg -l | grep openssl
(On Ubuntu, ensure the version returned matches the ones mentioned here.)
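If you would rather reason from the version string alone, the affected range listed above can be encoded in a small shell check. This is a sketch based only on upstream version letters; keep in mind that distributions often backport the fix while keeping the old version number (which is why the commands above also print the package Release), so treat a "vulnerable" answer as a prompt to check your package changelog rather than a final verdict.

```shell
# Classify an upstream OpenSSL version string against the
# Heartbleed-affected range (1.0.1 through 1.0.1f inclusive).
heartbleed_vulnerable() {
    case "$1" in
        1.0.1|1.0.1[a-f]) echo "vulnerable" ;;
        *)                echo "not vulnerable" ;;
    esac
}

heartbleed_vulnerable 1.0.1e   # vulnerable
heartbleed_vulnerable 1.0.1g   # not vulnerable
```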
You can patch / upgrade the OpenSSL version yourself. Just run the following commands on the appropriate server(s).
# For the Linux cPanel server:
- yum update openssl -y
- for service in sshd pure-ftpd httpd exim cpanel courier-imap ; do /etc/init.d/$service restart; done
- /etc/init.d/httpd stop;/etc/init.d/httpd startssl
# For the Linux Plesk server:
- yum update openssl -y
- /etc/init.d/psa stopall
- /etc/init.d/psa startall
- /etc/init.d/psa restart
# For the Linux plain server:
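For a plain (non-control-panel) RPM-based server, a minimal sketch follows. The service names below are assumptions — restart whatever on your machine actually links against libssl:

```shell
# Update the package, then restart services that keep libssl mapped in memory.
# The service list is an assumption -- adjust it to your server.
yum update openssl -y
for service in sshd httpd postfix dovecot; do
    /etc/init.d/$service restart
done
# Sanity check: any process still holding the old (deleted) library needs a restart
lsof -n 2>/dev/null | grep libssl | grep DEL
```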
More powerful than ever, the new Intel Xeon E7 enables servers equipped with a maximum of 80 physical cores, an ideal offer for critical applications.
Intel has just lifted the curtain on its new Westmere-EX generation of Xeon processors. These chips are etched at 32 nm and integrate Hyper-Threading technology. They also provide AES-NI encryption instructions and the TXT secure-execution technology. The range is divided into three subsets: the Xeon E7-2800 (two-socket), E7-4800 (four-socket), and E7-8800 (eight-socket).
If you are one of those who always keep an eye on new technology updates, then you surely know the benefits offered by Desktop Virtualization. If you are still unaware of this form of technology, then you should read this article. Desktop Virtualization, or VDI for short, simplifies management tasks and reduces costs. These potential TCO (Total Cost of Ownership) benefits are particularly noteworthy, but VDI deployments still often require an up-front investment in new hardware and software. In particular, the cost and management associated with storage for VDI deployments can be a big challenge, as you need a solution that can handle the performance and capacity requirements that come along with VDI.
2013 was a year of great advances for cloud computing, one in which cloud models such as SaaS, PaaS, IaaS, DaaS, and EaaS (or rather, Everything-as-a-Service) emerged and were positively received.
Today, demand for traditional hosting solutions is decreasing; if you announce a project and the term Cloud does not appear in it, many eyebrows will go up. In a few years, cloud computing, with its new models of automatic resource consumption, has established itself as a major trend in IT. But between trials, adoptions, more or less delicate marriages of public and private cloud, and advertising effects, the reality of the enterprise cloud has remained questionable.
With the Cloud now being seen as the de facto standard in enterprise computing, many are beginning to question the security of the environments provided and what steps have been taken to protect not just their servers, but also the data that they are hosting. Here we will look at some of the perceived threats to Cloud users and how the risk of these threats can be reduced.
User access levels
It is the responsibility of Cloud vendors to ensure that staff members are assigned only the access level necessary for them to perform their jobs, thereby preventing them from reaching information that could be classified and isn’t needed for their roles. This would mean providing support staff with access to the hardware infrastructure only, rather than full administrator access that could offer them a way into Cloud VMs. On occasion, staff members granted access well beyond their authority have attempted to access and steal data for personal gain, whether for the purpose of whistle-blowing or of selling private strategic documents to rivals. In other situations where support employees have been assigned access levels well beyond what they require, inexperienced workers have made mistakes while performing specific tasks that have led to huge data loss and downtime. In short, as a Cloud hosting provider you should ensure that employees are given the correct access, because their actions, malicious or not, could end up damaging your business and brand as well as customer confidence.
If you are getting this error on your own website, or you encountered it when visiting one, you may not have realized its actual meaning. Here is a detailed explanation of what this error means.
Error code 509 is a status code of the web server which, despite being used by quite a few applications on web servers, is not part of any standard and is not defined in any RFC.
It is a generic error message that appears when the website showing it has reached its defined traffic (data transfer) limit.
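On the wire, such a response typically looks like the following; since the code is non-standard, the exact reason phrase and body vary from host to host:

```
HTTP/1.1 509 Bandwidth Limit Exceeded
Content-Type: text/html

Bandwidth Limit Exceeded
```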
IBM introduces eight new servers based on the Power7+ processor, covering the entry and midrange segments.
Big Blue thus puts its famous Power7+ processor within the reach of SMEs. The new servers are optimized for analytical applications related to big data, and they use breakthrough Watson technology (query tools, including DeepQA). Pricing is quite competitive for the proposed hardware.
IBM also introduced new storage units to simplify data storage in the cloud and to reduce costs through consolidation.
Western Digital intends to provide SMEs and the branch offices of large accounts with storage in a rack-mount server format, based on Intel Atom, with a maximum capacity of 16 TB, oriented toward data preservation and with support for virtual environments.
For SMEs and the branch offices of large accounts that have opted for rack-type storage infrastructure, Western Digital responds with a dedicated NAS server: the Sentinel RX4100.
In a 1U format, this single-socket model based on Intel Atom (dual-core, 1.8 GHz) is positioned as a successor to the DX4000. With four SATA 3 (6 Gb/s) bays, its maximum storage capacity is 16 TB in HDD or SSD, running a Windows Storage Server 2008 R2 Essentials environment.