Blog Archives

IT Disaster Recovery and Tech Trends


As we’ve seen in recent years, natural disasters can lead to long-term downtime for organizations. Because earthquakes, hurricanes, snow storms, or other events can put data centers and other corporate facilities out of commission for a while, it’s vital that companies have in place a comprehensive disaster recovery plan.

Disaster recovery (DR) is a subset of business continuity (BC), and like BC, it’s being influenced by some of the key trends in the IT industry, foremost among them:

  • Cloud services
  • Server and desktop virtualization
  • The proliferation of mobile devices in the workforce
  • The growing popularity of social networking as a business tool

These trends are forcing many organizations to rethink how they plan, test, and execute their DR strategies. CSO previously looked at how these trends are specifically affecting IT business continuity; as with BC, much of the impact they are having on DR is for the better. Still, IT and security executives need to consider how these developments can best be leveraged so that they improve, rather than complicate, DR efforts.

Source: 4 tech trends in IT disaster recovery | Data Center – InfoWorld.

Head over to the source and see how IT disaster recovery is being impacted by each of the four.


Innovations Continue At Symform


Symform, a revolutionary cloud storage and backup service, today announced enhancements to its Cloud Storage Network that improve the performance, security, and international capabilities of Symform’s innovative peer-to-peer backup model. The new version accelerates data upload times for large data sets, offers more options for privacy control, and supports long file path names and international characters. These features are in direct response to the global adoption of Symform’s Cloud Network by small and medium businesses in 150 countries and the continued explosive growth of digital data needing to be protected and stored.

“At Symform, we are constantly searching for new and better ways to serve our fast-growing global customer base by offering a solution that is widely accessible and more affordable than costly, traditional cloud storage models,” said Praerit Garg, president and co-founder of Symform. “We take pride in offering the industry’s first decentralized cloud back-up and storage solution, and are continuing to innovate and perfect that model with each new release.”

In a recent Symform survey, respondents overwhelmingly cited the cost of cloud storage as a top concern, particularly among resource-strapped small and mid-sized businesses (SMBs). Symform offers a dramatic alternative to traditional ‘data center-reliant’ cloud storage models, using a peer-to-peer network of contributors and consumers that keeps costs to a minimum while ensuring the highest levels of security and reliability.

Source: Symform Continues to Innovate Cloud Storage Network and Peer-to-Peer Model With Faster Data Backup and Enhanced Security & Privacy | Virtual-Strategy Magazine.

One of the keys with technology is to improve and enhance while remaining secure and reliable.  It looks as if Symform is doing that while also keeping their service cost-effective.  Check the source to see what innovations came with the latest release.


The long life and slow death of the virtual server

A drawback of the virtual machine world?

Back before we spun up VMs on a whim to handle whatever application or platform we needed, every deployment was painstaking and time-consuming. These servers would be carefully built by installing the OS from the ground up, tweaking the BIOS, installing drivers, and layering the applications or frameworks on top of all of the above. We would back up that server to tape and hope the server would reach hardware obsolescence before it broke down.

In either case, the server that replaced this physical server would almost certainly be different, and the notion of restoring the bare-metal backup on a new physical server often meant more work than just starting fresh on the new hardware. This was especially true for Windows servers. Starting anew was a good way to clear out the cruft of years of operation and begin again with a blank slate.

In the world of server virtualization, the day for the organic refresh never arrives. Virtual servers don’t break down. They don’t become obsolete. They simply keep going, while the physical hardware cycles underneath them throughout their existence. In fact, the only reason to rebuild on a new VM is if the OS vendor has stopped supporting that version and there are no more security updates to be had. Even then, you’ll find a great many instances where that VM will continue to run forever or until it becomes compromised.

Paul Venezia makes some very interesting points.  Read the rest: The long life and slow death of the virtual server | Data Center – InfoWorld.

May be easier than you think …

to steal a virtual machine and its data.

Remember the email server or payroll system that you virtualized? Someone with administrator access to your virtual environment could easily swipe it and all the data without anybody knowing. Stealing a physical server out of a data center is very difficult and is sure to be noticed; stealing a virtual machine (VM), however, can be done from anywhere on your network, and someone could easily walk out with it on a flash drive in their pocket.

Virtualization offers many benefits over physical servers, but there are some pitfalls you should be aware of and protect against to avoid losing sensitive data. Because a virtual machine is encapsulated in a single virtual disk file that resides on a virtual host server, it is not all that difficult for someone with the appropriate access to make a copy of that disk file and access any of the data on it. This is a fairly simple thing to do, and we will show you how to do it here so you can protect your environment against it.

There are basically two ways one could access the virtual disk (.vmdk) file of a virtual machine. The first is through the ESX Service Console: if someone knew the root password or had a user account on the host, they could gain access to the VMFS volumes that contain the virtual machine files and use copy tools such as Secure Copy (SCP) to copy files from them. The second is through the vSphere/VMware Infrastructure Client, which contains a built-in datastore browser; this is the method we will cover here.
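To see how little machinery the first route involves, here is a minimal sketch of the copy step. The host name, datastore path, and VM name below are hypothetical placeholders (not from the article), and the sketch only assembles the command rather than running it against a live host.

```python
from typing import List

def build_scp_command(host: str, remote_vmdk: str, local_path: str) -> List[str]:
    """Assemble the scp invocation that would pull a VM's disk file off an ESX host.

    Anyone holding root (or a user account) on the host could run the
    resulting command from anywhere with SSH access to the Service Console.
    """
    return ["scp", f"{host}:{remote_vmdk}", local_path]

# Hypothetical host and datastore path, for illustration only.
cmd = build_scp_command(
    "root@esx01.example.com",
    "/vmfs/volumes/datastore1/payroll-vm/payroll-vm-flat.vmdk",
    "./copied.vmdk",
)
print(" ".join(cmd))
# A real attacker would then execute this, e.g. subprocess.run(cmd, check=True)
```

The point of the sketch is that the "theft" is a single file copy: one command, run with credentials the host already accepts, walks away with the entire machine and its data.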

The security key: Understand the unique challenges presented by a virtual environment.

The bottom line is there are multiple layers you need to protect to ensure your data is safe. Protect the data, the application, the operating system and the physical server, and make sure you also protect the virtualization layer. Don’t focus your security efforts in all those other areas and forget one that can compromise them all. Not understanding and addressing the security challenges that are unique to virtual environments can be a costly mistake that you don’t want to make. (Source: SearchVMWare.com)

Go to the source and read the rest of this interesting article.


Dump IT assets and move to cloud?

An interesting prediction by Gartner.

Cloud computing will become so pervasive that by 2012, one out of five businesses will own no IT assets at all, the analyst firm Gartner is predicting.

The shift toward cloud services hosted outside the enterprise’s firewall will necessitate a major shift in the IT hardware markets, and shrink IT staff, Gartner said.

“The need for computing hardware, either in a data center or on an employee’s desk, will not go away,” Gartner said. “However, if the ownership of hardware shifts to third parties, then there will be major shifts throughout every facet of the IT hardware industry. For example, enterprise IT budgets will either be shrunk or reallocated to more-strategic projects; enterprise IT staff will either be reduced or reskilled to meet new requirements, and/or hardware distribution will have to change radically to meet the requirements of the new IT hardware buying points.”

If Gartner is correct, the shift will have serious implications for IT professionals, but presumably many new jobs would be created in order to build the next wave of cloud services.

But it’s not just cloud computing that is driving a movement toward “decreased IT hardware assets,” in Gartner’s words. Virtualization and employees running personal desktops and laptops on corporate networks are also reducing the need for company-owned hardware. (Source: InfoWorld)

Check the source link above to see other Gartner predictions.


Security starts with infrastructure assessment

Interesting article on cloud computing security.

Security professionals are facing the difficult challenge of extending security requirements to take advantage of cloud computing and software-as-a-service applications.

Particularly difficult is finding ways to secure the new boundaries between the enterprise, the cloud service and the end user while managing dependencies on off-premise infrastructure and privileged operators. And they have to do all this without inhibiting flexibility and agility.

It’s a challenge that security professionals have to overcome when considering cloud adoption.

Research firm IDC predicts that 76% of U.S. organizations will use at least one SaaS-delivered application for business use by the close of 2009. Cloud-based services adoption is being driven by the business performance benefits and realized cost efficiencies. This isn’t new for those of us in IT. Mission critical information already is handled in the cloud for companies that outsource email services or maintain customer information in CRM systems such as Salesforce.com. The challenge for security teams is to safely integrate extended cloud capabilities into corporate policies and procedures.

The best approach?

Forrester recommends the usual checklist of cloud security requirements that any enterprise would have for internally hosted applications. Authenticate users and control access to applications, tightly log and audit privileged operations, protect sensitive data to prevent loss and meet compliance mandates, and reduce risk with rigorous vulnerability management, according to Forrester. Take into account differences in the SaaS vendor’s infrastructure and business practices when evaluating the sensitivity to security. For instance, expect the cloud vendor to be replicating data between data centers for performance and business continuity and expect to have a degree of shared resources with virtualized application environments. (Source: Cloud security begins with infrastructure assessment – Search Security.com)

Click the source to read the whole thing.
