Category Archives: Network Administration

IT Disaster Recovery and Tech Trends

 

As we’ve seen in recent years, natural disasters can lead to long-term downtime for organizations. Because earthquakes, hurricanes, snowstorms, and other events can put data centers and other corporate facilities out of commission for extended periods, it’s vital that companies have a comprehensive disaster recovery plan in place.

Disaster recovery (DR) is a subset of business continuity (BC), and like BC, it’s being influenced by some of the key trends in the IT industry, foremost among them:

  • Cloud services
  • Server and desktop virtualization
  • The proliferation of mobile devices in the workforce
  • The growing popularity of social networking as a business tool

These trends are forcing many organizations to rethink how they plan, test, and execute their DR strategies. CSO previously looked at how these trends are specifically affecting IT business continuity; as with BC, much of the impact they are having on DR is for the better. Still, IT and security executives need to consider how these developments can best be leveraged so that they improve, rather than complicate, DR efforts.

Source: 4 tech trends in IT disaster recovery | Data Center – InfoWorld.

Head over to the source and see how IT disaster recovery is being affected by each of the four trends.

 

Email In Security Hot Seat

As technology evolves with the rise of the cloud and BYOD, so does the debate on keeping corporate information secure.

Many companies also require remote-wiping capability on employee devices in case they are lost or stolen, along with software to encrypt communications. They also require employees not to reuse a single password across multiple sites, and some forbid single-word passwords.
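To make that last rule concrete, here is a minimal sketch of such a password check in Python; the wordlist and the specific rules are illustrative assumptions, not requirements from the article:

```python
import re

# A tiny, hypothetical password-policy check illustrating the rules above:
# no single dictionary word, letters-only passwords discouraged, no reuse.

COMMON_WORDS = {"password", "dragon", "monkey", "letmein"}  # stand-in for a real wordlist

def violates_policy(password, previously_used):
    """Return a list of policy violations (an empty list means the password passes)."""
    problems = []
    if password.lower() in COMMON_WORDS:
        problems.append("password is a single dictionary word")
    if re.fullmatch(r"[A-Za-z]+", password):
        problems.append("password contains only letters")
    if password in previously_used:
        problems.append("password has been used before")
    return problems

if __name__ == "__main__":
    used = {"Summer2012"}
    print(violates_policy("monkey", used))          # flags dictionary word, letters-only
    print(violates_policy("c0rrect-h0rse!", used))  # [] -- passes
```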

But Parris, who formerly held technical and sales management positions at Boeing Computer Services and founded Intercede, argues that securing email also requires identity management — a system that creates a digital identity for employees and other third parties connected to an enterprise, and then tracks “who is sending which email and information to whom, when and protecting it in transit and at rest.”

Even that will not ensure protection of the email, he said. “It must also be run on a secure platform that delivers tightly controlled policy to enforce data labeling, digital message signing, encryption and checking of the actual content.”

Jeff Wilson, principal analyst for security at Infonetics, agrees that an email management platform would help, since “most people are getting email on [multiple] mobile devices that could be lost, stolen, or compromised.”

But he noted a more basic problem for many companies: “They don’t even have an accurate inventory of devices connecting to their network or a framework for building a security policy and buying appropriate security solutions.”

Since email is the primary method of information sharing, enterprises must keep it secure, “to protect intellectual property and to compete in the global business environment,” Parris said.

Source: Email in security hot seat with rise of cloud, BYOD | Consumerization Of It – InfoWorld.

Easy Cracking of Microsoft Crypto

Another day, another set of cracking tools.

Cryptography specialist Moxie Marlinspike released tools at Defcon today for easily cracking passwords in wireless and virtual private networks that use a popular encryption protocol based on an algorithm from Microsoft called MS-CHAPv2, news that will no doubt worry many a network administrator.

The tools crack WPA2 (Wi-Fi Protected Access) and VPN passwords used by corporations and organizations running networks that are protected by the PPTP (Point-to-Point Tunneling Protocol), which uses MS-CHAPv2 for authentication. ChapCrack captures the MS-CHAPv2 handshakes, or SSL (Secure Sockets Layer) negotiation communications, and converts them to a token that can be submitted to CloudCracker. It takes less than a day for the service to return results in the form of another token that is plugged back into ChapCrack, where the DES (Data Encryption Standard) keys are cracked.

With that data, someone can see all of the information traveling across the Wi-Fi network, including sensitive corporate e-mails and passwords, and use passwords that were revealed to log in to corporate networks. The tools are designed for penetration testers and network auditors to use to check the security of their WPA2-protected networks and VPNs, but they may well be used by people who want to steal data and get unauthorized access to networks.

Source: Tools boast easy cracking of Microsoft crypto for businesses | Security & Privacy – CNET News.

Yet another reason for businesses that haven’t done so yet to move beyond PPTP and Windows XP. The back-of-the-envelope sketch below shows why the attack is so practical.
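The reason a single day suffices is that MS-CHAPv2’s handshake can be reduced to an exhaustive search of one 56-bit DES key space, no matter how strong the password is. A rough sketch of the arithmetic in Python, where the keys-per-second figure is an assumed rate for dedicated FPGA hardware rather than a measured one:

```python
# Rough arithmetic behind the "less than a day" claim: MS-CHAPv2 reduces to
# a brute-force search of a single 56-bit DES key space. The keys-per-second
# figure below is a hypothetical rate for dedicated FPGA cracking hardware.

DES_KEYSPACE = 2 ** 56                 # every possible 56-bit DES key
KEYS_PER_SECOND = 1_000_000_000_000    # assumed: ~1 trillion keys/sec

worst_case_seconds = DES_KEYSPACE / KEYS_PER_SECOND
print(f"Worst case: {worst_case_seconds / 3600:.1f} hours")   # ~20 hours
print(f"Average:    {worst_case_seconds / 7200:.1f} hours")   # half that, on average
```

At the assumed trillion keys per second, the worst case lands around 20 hours, which squares with the “less than a day” turnaround described above.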

“Do you need a honeypot?”

An interesting argument for why honeypots are an important part of the security arsenal.

Let’s start at the beginning: what is a honeypot? Put simply, it is a machine designed to tempt an unknowing attacker into targeting it, while being configured to trace the attacker’s origins and identify them. That description can create the perception that honeypots are a quagmire of risk and liability, and it raises understandable concerns about willingly allowing an attacker onto a system under your control.

However, there are many forms of honeypots, and they can be used in many different ways. The idea of the honeypot as merely a host designed to be breached, sitting on the perimeter of your network, is far from the whole picture. Let’s take a look at some different uses of honeypot-style systems and consider their place in a well-equipped enterprise security program.

Building a fully functional and interactive honeypot that resembles a real production system can be a daunting task, replete with risk (you would be, after all, building a machine with the intention of it falling under the control of an attacker), but there are many other levels of honeypot before this level of complexity, and all of them present value to security monitoring.

Source: Do you need a honeypot?.

Very informative descriptions of some of the honeypot methods that are out there for use by organizations. As Conrad Constantine points out:

The use of honeypots, like everything in information security, is always evolving and the technique has a lot of potential to disrupt attackers by wasting their time and resources, directing them away from their true targets and forcing them to reveal themselves.
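At the simplest end of that spectrum, a low-interaction honeypot can be little more than a listener on an unused port that logs whoever touches it; nothing should ever connect there legitimately, so every hit is a signal. A minimal sketch in Python, where the port number and log file are arbitrary illustrations rather than recommendations from the article:

```python
import logging
import socket

# Minimal low-interaction honeypot sketch: listen on an unused port and log
# every connection attempt. Port 2222 and the log filename are arbitrary.
logging.basicConfig(filename="honeypot.log", level=logging.INFO,
                    format="%(asctime)s %(message)s")

def run(port=2222):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("0.0.0.0", port))
        srv.listen()
        while True:
            conn, (addr, src_port) = srv.accept()
            logging.info("connection attempt from %s:%d", addr, src_port)
            conn.close()  # accept, record, and drop -- no real service behind it

if __name__ == "__main__":
    run()
```

Because there is no real service behind the socket, the risk profile is tiny compared with the fully interactive honeypot the article warns about.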

The long life and slow death of the virtual server

A drawback of the virtual machine world?

Back before we spun up VMs on a whim to handle whatever application or platform we needed, every deployment was painstaking and time-consuming. These servers would be carefully built by installing the OS from the ground up, tweaking the BIOS, installing drivers, and layering the applications or frameworks over all of the above. We would back up that server to tape and hope the server would reach hardware obsolescence before it broke down.

In either case, the server that replaced this physical server would almost certainly be different, and the notion of restoring the bare-metal backup on a new physical server often meant more work than just starting fresh on the new hardware. This was especially true for Windows servers. Starting anew was a good way to clear out the cruft of years of operation and begin again with a blank slate.

In the world of server virtualization, the day for the organic refresh never arrives. Virtual servers don’t break down. They don’t become obsolete. They simply keep going, while the physical hardware cycles underneath them throughout their existence. In fact, the only reason to rebuild on a new VM is if the OS vendor has stopped supporting that version and there are no more security updates to be had. Even then, you’ll find a great many instances where that VM will continue to run forever or until it becomes compromised.

Paul Venezia makes some very interesting points.  Read the rest: The long life and slow death of the virtual server | Data Center – InfoWorld.
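One practical answer to those immortal VMs is a periodic inventory that compares each guest OS against its end-of-support date. A hypothetical sketch in Python; the inventory pairs would come from whatever your virtualization platform reports, and are hard-coded here purely for illustration:

```python
from datetime import date

# Hypothetical inventory: (VM name, guest OS) pairs, as your platform might report them.
INVENTORY = [
    ("legacy-app01", "Windows Server 2003"),
    ("web-frontend", "Windows Server 2008 R2"),
]

# End of Microsoft extended support for each guest OS.
EOL_DATES = {
    "Windows Server 2003": date(2015, 7, 14),
    "Windows Server 2008 R2": date(2020, 1, 14),
}

def flag_unsupported(today=None):
    """Yield VMs whose guest OS is past its end-of-support date."""
    today = today or date.today()
    for name, guest_os in INVENTORY:
        eol = EOL_DATES.get(guest_os)
        if eol is not None and today > eol:
            yield name, guest_os, eol

for name, guest_os, eol in flag_unsupported():
    print(f"{name}: {guest_os} unsupported since {eol}")
```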

“The firewall threat you don’t know”

Are you placing active filters on data leaving your network?


The purpose of a firewall has been burned into the head of just about every person who uses the Internet, and the thought of functioning without protection from the bad people is sheer lunacy.

However, nearly all firewalls are unidirectional. They may protect you from nefarious pokes and prods from the nether regions of the Internet, but they’ll happily ship out any data you send from the inside. Only at the higher levels of enterprise IT do you see active filters for data leaving the network.

Paul Venezia makes a great point at the end:

As in so many facets of IT, to be forewarned is to be forearmed. The quest for true network security and visibility is an ongoing struggle, and even with all the notice in the world, there’s no winning this arms race. But that doesn’t mean we can just quit. If you’re not watching your outbound traffic now, plan on doing so as soon as possible. Whether you start with something as “simple” as NTop or go for the big guns like the NIKSUN device, it’s a worthwhile investment of time and money — kinda like firewalls.

Read more at:  The firewall threat you don’t know | Data Center – InfoWorld.
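You don’t need big-gun gear to take the first step Venezia describes. As a trivial starting point, here is a sketch in Python using the third-party psutil library to list established TCP connections and flag remote ports outside an expected set; the allowlist is purely an assumption for illustration:

```python
import psutil  # third-party: pip install psutil

# Ports we expect outbound traffic to use; anything else gets flagged.
# This allowlist is purely illustrative -- tune it to your environment.
EXPECTED_PORTS = {53, 80, 123, 443, 993}

for conn in psutil.net_connections(kind="tcp"):
    if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
        if conn.raddr.port not in EXPECTED_PORTS:
            print(f"unexpected outbound: {conn.laddr.ip}:{conn.laddr.port} "
                  f"-> {conn.raddr.ip}:{conn.raddr.port} (pid {conn.pid})")
```

It’s nowhere near NTop, let alone a NIKSUN appliance, but it makes the point: outbound visibility starts with simply looking.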

Companies and Free Tools

From Room362:

Penetration Testing / Red Teaming requires the use of a lot of tools. I don’t mind getting called a “script kiddie” because I can accomplish more and faster when I don’t have to code every single task I need to do. This post is to point out companies that make this possible and give a small bit of thanks.

(If you’ve ever tried to convince a company to give something away for free, you can understand how big this really is.) Some give a lot, some only one tool, but even one is more than some.

Get the list:  Companies that give back with free tools – Blog – Room362.com.

CompTIA Security

 

Security is a major aspect of IT.  One of the great ways to take one’s IT security training to the next level is to obtain a CompTIA certification.  Here’s part of a great interview that Techopedia recently did with CompTIA’s director of product management, Carol Balkcom.

Techopedia: Many know CompTIA for its A+ certification. Tell us about your other security offerings.
Carol Balkcom: CompTIA Security+ is our first exam devoted entirely to security, and it was originally launched in 2002. All of our exams are “vendor neutral”, meaning that they aren’t tied to any one vendor’s products – and Security+ is no exception.
CompTIA A+ and Network+ also have security components in them, because of course today’s support technicians and network administrators must also be knowledgeable about security. As an aside, all three of these exams (A+, Network+, Security+) are on the U.S. Department of Defense Directive 8570 that requires certification for information assurance personnel. As a result, a large number of professionals have taken these certifications over the last few years.
To get back to our security offerings, earlier this year we formally launched the first in CompTIA’s “Mastery” series of exams, our CompTIA Advanced Security Practitioner (CASP).

Techopedia: Tell us more about Security+. What major subject areas are covered and who is the primary audience?
Carol Balkcom: The primary audience for Security+ is IT professionals with two or more years of hands-on, technical information security experience. There are Security+ certified professionals in all types of organizations, from the U.S. Navy to General Mills to the Archdiocese of Philadelphia. As to the subject areas in Security+, the broad knowledge “domains” are network security, compliance and operational security, threats and vulnerabilities, application, data and host security, access control and identity management, and cryptography.

Techopedia: What about CASP? Can you tell us more about the designation?
Carol Balkcom: For the CompTIA Advanced Security Practitioner (CASP), we recommend at least 10 years in IT and five years of hands-on technical security experience. It is intended for the security architect working in a large, multi-location organization. The CASP also looks at the security implications of business decisions, such as the acquisition of one company by another, as an example.

Be sure to check out the rest of the interview, which includes Ms. Balkcom’s take on the certification vs. experience question.

Windows Server GUI on way out

The GUI for Windows Server will eventually be no more.  Here’s what Don Jones, over at RedmondMag.com, had to say about this eventuality.


  • The full GUI experience on the Windows Server OS is now optional. Software meant to run on a server should not assume a GUI will be there, nor should it take for granted any of the many other dependencies that the full server OS has traditionally included. My analysis on this: It’s Microsoft’s shot across the bow. You’ll see a stronger position on this sometime in the future — maybe a few years off, but sometime. They want server apps to assume they’re running on what we used to call “Server Core.”
  • The recommended way to run the Server OS is without the GUI. Didja see that? No, you don’t have to think it’s a good idea — I’m not pushing acceptance. I’m pointing out what’s happening. These are the facts on the ground.
  • Microsoft has taken a (what I think is a very good) middle-ground step by introducing a “minimal GUI” mode in the server OS. That means you can have your GUI tools on the Server OS, as well as on your client computer, provided those GUI tools play by a few basic rules and don’t assume too many dependencies (like the presence of IE). They’ll have the full .NET Framework at their disposal, for example, which should help — especially if they’re tools based on the MMC architecture. So this gets you a “lighter” version of the Windows Server OS, but still lets you manage right on the console. My opinion, for what it’s worth: Anyone who thinks “minimal GUI” mode is anything more than a holding measure is crazy. To me, this clearly says Microsoft is trying to get us off the console for good. They know we’re not ready to give it up completely, so this is them trying to wean us off. Maybe I’m wrong on this — it’s happened before — but it sure seems that way when I look at the big picture.
  • Notwithstanding the “minimal GUI” mode, Microsoft is recommending that software developers not assume a GUI will be present. The full, rich GUI experience happens on the client. Not allowed to connect to your servers from your client computer? The suggestion appears to be “rethink your architecture.”

In other words, make sure you know the command-line interface and how to remote into a server, because before long that will be your only way to access Windows Server.

My opinion is that Microsoft is pointed toward a world of “headless servers”: minimal functionality from the console, rich management from a client computer. This is a step in that direction, and it’s intended to put us, and software vendors, on notice. Me, I’m going to take the hint. I hope y’all do as well. Windows Server “8” is a chance to start getting on board with what Windows will become — it’s not throwing us directly into the fire, but I think we have to take the opportunity to start adapting to this new direction.
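For scripts and tools that need to respect this shift, one option is to check whether the full GUI is even present before assuming it. A small sketch in Python that reads the InstallationType value from the registry (it reports strings such as “Server”, “Server Core”, or “Client”); treat this as an illustrative heuristic rather than an official detection method:

```python
import winreg  # Windows-only standard library module

def installation_type():
    """Read InstallationType ("Server", "Server Core", "Client", ...) from the registry."""
    key_path = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
        value, _ = winreg.QueryValueEx(key, "InstallationType")
        return value

if __name__ == "__main__":
    kind = installation_type()
    if kind == "Server Core":
        print("No GUI here -- stick to command-line and remote management.")
    else:
        print(f"InstallationType is {kind!r}; a GUI may be present.")
```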


Troubleshooting a Slow Network Connection

Great tips on network troubleshooting from the CompTIA IT Pro networking blog.

  • Problem at the physical layer: Many times, I’ve found that slow networks occur because of some sort of problem with a particular device (e.g., a cable modem or a switch), or even with the network cable itself. If you’re using a cable modem, try restarting it before contacting anyone or going any further. Check that all physical connections are sound; a loose wire can mimic other problems. Start here, and you’ll be able to move forward with confidence. Additional issues can include firmware update problems. One time, I had a cable modem that simply “bricked” because my ISP’s automatic update procedure failed. Other times, I’ve found that a cable modem hadn’t fully installed a firmware update, causing slowness. Sometimes you need to get a new modem; other times, you simply need to complete the firmware update or restart the device.
  • Network service problem: Start with diagnosing DNS issues. We all know what a completely failed DNS server can do to you. But have you ever been in a situation where you go to a familiar URL (e.g., http://www.bbc.co.uk) and the browser tells you that it is “looking up” or “resolving” the URL? It will eventually find the URL and resolve it for you. This is likely due to a problem with your DNS server or that of your ISP. Restart the service if it’s your own; if you’re using a DNS server provided by the ISP, either switch to a backup server or inform them that there’s a problem. As with the previous piece of advice, restarting your computer can help resolve this issue, too (a quick timing check like the sketch after this list can also tell you whether name resolution is the bottleneck). Additional services to consider include domain controllers, Microsoft networking / Samba servers, and torrent services. In some cases, network traffic will run slow because your network isn’t configured to prioritize certain traffic types. In other cases, you’ll need to set up port forwarding so that certain traffic types on your network will be properly forwarded by your router. For those of you interested in how an enterprise network prioritizes traffic, check out the following link about QoS.
  • Computing device issue: A friend of mine was once convinced that his company’s ISP was at fault for slow network speeds. It turned out that his system was infested with spyware, causing a serious slowdown in networking. Removing the spyware solved the problem. In another case, the computing device had the wrong software driver installed for the network card; resolving that fixed the slowdown nicely.
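To put a number on the “looking up” symptom from the second bullet, you can time name resolution directly. A quick sketch in Python using only the standard library; the hostnames and the 500 ms threshold are arbitrary choices for illustration:

```python
import socket
import time

# Time DNS resolution for a few familiar names. Consistently slow lookups
# (the threshold below is an arbitrary cutoff) point at the resolver, not the link.
HOSTS = ["www.bbc.co.uk", "www.example.com"]
SLOW_THRESHOLD_S = 0.5

for host in HOSTS:
    start = time.monotonic()
    try:
        socket.getaddrinfo(host, 80)
        elapsed = time.monotonic() - start
        verdict = "SLOW" if elapsed > SLOW_THRESHOLD_S else "ok"
        print(f"{host}: {elapsed * 1000:.0f} ms [{verdict}]")
    except socket.gaierror as err:
        print(f"{host}: lookup failed ({err})")
```

Note that the operating system may cache lookups, so run it more than once and pay attention to the first, uncached pass.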

Head to the source to find out some other things to investigate when dealing with a slow network connection.
