Sai Krishna D.

Hyderabad, Andhra Pradesh, India


Wednesday, December 28, 2011

The 10 biggest perks of working in IT


Regardless of what you do for a living, it’s easy to focus on the negatives of the job and let those things bring you down. However, most jobs have certain perks, and IT is no exception. This article discusses some of the benefits I’ve experienced over the years as a result of working in IT.

1: You get to meet lots of people

One of my absolute favorite things about working in IT is that you get to meet so many interesting people. Back in the mid-90s, for example, I worked for a large insurance company with about 1,000 users. I can honestly say that I knew most of those users on a first-name basis. Better still, even though I left the company about 15 years ago, some of the people I met there are still my best friends to this day.

Without a doubt, the greatest benefit that came from getting to know so many people was that I met my wife of 17 years as a direct result of working in IT. She was working in the marketing department at the time, and I met her because she called me to fix her printer.

2: The money can be good

Even though IT will probably never be the way that it was during the dot-com boom, IT does tend to pay better-than-average salaries. Of course, the pay level varies considerably from one company to the next and from one position to the next.

3: It’s easy to move around

One thing I have always noticed about IT is that it is relatively easy to move around. I have known plenty of IT pros who got bored with their position and switched to a different IT specialty with minimal effort. For instance, I have known network administrators who became database administrators and software developers who became network administrators.

4: You have personal freedom

IT pros tend to have a lot of personal freedom. I will be the first to admit that corporate culture can vary considerably from one organization to the next and that some organizations are more permissive than others. Even so, I can’t remember anyone ever making me punch a time clock or stick to a rigid break schedule. Most of the IT jobs I have had have allowed me to set my own hours and even work from home when I wanted to (within reason). Likewise, I have always had total freedom to decorate my office any way I wanted.

5: You get to help people

Another great thing about working in IT is that you get to help a lot of people. Some users dread dealing with IT, because they usually call only when they have a problem they need you to solve. Even so, I have always found it gratifying to be able to end the day knowing that I was able to spend it helping people.

6: You get paid to spend time away from the office

This may not apply to everybody, but one thing I have always enjoyed immensely about IT is the travel. The very nature of the job means that you constantly have to learn new things and oftentimes, this means traveling to training classes and technical conferences.

Although I do confess to being a travel junkie, there is also something very cool about being away from the office for a few days without having to burn up any of your vacation time. What’s even better is that technology conferences tend to be held in places where there are plenty of things to see and do after hours.

7: You sometimes face unusual challenges

Few things in life bring me down faster than monotony. While every job has some amount of repetition, IT has the unique advantage of requiring creative solutions to unusual problems. There is definitely something to be said for being challenged once in a while.

8: You have access to cool toys

A definite perk of working in IT is having access to cool toys. Just yesterday, for example, I had to spend several hours in a hospital waiting room, so I got some work done using my Windows 8 tablet. While doing so, several people stopped to ask me where I got the tablet, since Windows 8 won’t be out until sometime next year.

The same basic concept has always held true regardless of the hot technology of the moment. Back in the 90s, I remember using a flatbed scanner to copy pictures for my friends at a time when none of them had ever even heard of a scanner.

9: IT knowledge can be helpful in everyday life

Although perhaps not a job perk, IT knowledge can definitely be helpful in everyday life. For example, there was a time long, long ago when the network cabling standard of choice was coaxial Ethernet. I spent one entire summer pulling coaxial cable and attaching cable ends. At the time, I hated the job. But even though nobody uses coaxial Ethernet anymore, the knowledge I gained installing all that cable came in handy just last week.

My next-door neighbors had some carpet installed. The installer accidentally cut their satellite cable. The cable used by satellite dishes is similar to what was used for Ethernet so long ago. Since I still have my tools, I was able to repair the cable for them, so they didn’t have to wait a week and pay for a service call from the satellite company.

10: The job sometimes comes with special rewards

Earlier, I mentioned that one of the great things about working in IT is that you get to help people. Sometimes the people you help are so grateful that they provide a special reward. Over the years, I have had clients send me various gifts as a way of saying thank you for helping them out in a pinch. When I worked for the military, some of the people I helped even thanked me by taking me for joy rides in tanks and helicopters.

Don’t get me wrong — I don’t help people because I expect to get something in return. However, it is always a nice feeling when someone surprises you with a thank-you gift.

Other perks?

What other aspects of your IT job make you happy? Do the good things outweigh the bad?


The future of IT will be reduced to three kinds of jobs


There’s a general anxiety that has settled over much of the IT profession in recent years. It’s a stark contrast to the situation just over a decade ago. At the end of the 1990s, IT pros were the belles of the ball. The IT labor shortage regularly made headlines and IT pros were able to command excellent salaries by getting training and certification, job hopping, and, in many cases, being the only qualified candidate for a key position in a thinly-stretched job market. At the time, IT was held up as one of the professions of the future, where more and more of the best jobs would be migrating as computer-automated processes replaced manual ones.

Unfortunately, that idea of the future has disappeared, or at least morphed into something much different.



The glory days when IT pros could write their own ticket evaporated when the Y2K crisis passed and then the dot-com implosion happened. Suddenly, companies didn’t need as many coders on staff. Suddenly, there were a lot fewer startups buying servers and hiring sysadmins to run them.

Around the same time, there was also a general backlash against IT in corporate America. Many companies had been throwing nearly-endless amounts of money at IT projects in the belief that tech was the answer to all problems. Because IT had driven major productivity improvements during the 1990s, a lot of companies over-invested in IT and tried to take it too far too fast. As a result, there were a lot of very large, very expensive IT projects that crashed and burned.

When the recession of 2001 hit, these massively overbuilt IT departments were huge targets for budget cuts, and many of them got hit hard. As the recession dragged on through 2002 and 2003, IT pros mostly told each other that they needed to ride out the storm and that things would bounce back. But a strange thing happened: IT budgets remained flat year after year. The rebound never happened.

Fast forward to 2011. Most IT departments are a shadow of their former selves. They’ve drastically reduced the number of tech support professionals, or outsourced the help desk entirely. They have a lot fewer administrators running around to manage the network and the servers, or they’ve outsourced much of the data center altogether. These were the jobs that were at the center of the IT pro boom in 1999. Today, they haven’t totally disappeared, but there certainly isn’t a shortage of available workers or a high demand for those skill sets.

That’s because the IT environment has changed dramatically. More and more traditional software has moved to the web, or at least onto internal servers where it is served through a web browser. Many technophobic Baby Boomers have left the workforce and been replaced by Millennials who not only don’t need as much tech support, but often want to choose their own equipment and view the IT department as an obstacle to productivity. In other words, today’s users don’t need as much help as they used to. Cynical IT pros will argue this until they are blue in the face, but it’s true. Most workers have now been using technology for a decade or more and have become far more proficient than they once were. Plus, the software itself has gotten better. It’s still horribly imperfect, but it’s better.

So where does that leave today’s IT professionals? Where will the IT jobs of the future be?

1. Consultants

Let’s face it: all but the largest enterprises would prefer not to have any IT professionals on staff, or at least as few as possible. It’s nothing personal against geeks; it’s just that IT pros are expensive, and when IT departments get too big and centralized, they tend to become experts at saying, “No.” They block more progress than they enable. As a result, we’re going to see most traditional IT administration and support functions outsourced to third-party consultants. This covers a wide range, from huge multinational consultancies to the one-person shop that serves as the rented IT department for local SMBs. I’m also lumping in companies like IBM, HP, Amazon AWS, and Rackspace, which will rent out both data center capacity and IT professionals to help deploy, manage, and troubleshoot solutions. Many of the IT administrators and support professionals who currently work directly for corporations will transition to working for big vendors or consultancies as companies switch to purchasing IT services on an as-needed basis to lower costs, get a higher level of expertise, and get 24/7/365 coverage.

2. Project managers

Most of the IT workers that survive and remain as employees in traditional companies will be project managers. They will not be part of a centralized IT department, but will be spread out in the various business units and departments. They will be business analysts who will help the company leaders and managers make good technology decisions. They will gather business requirements and communicate with stakeholders about the technology solutions they need, and will also be proactive in looking for new technologies that can transform the business. These project managers will also serve as the company’s point of contact with technology vendors and consultants. If you look closely, you can already see a lot of current IT managers morphing in this direction.

3. Developers

By far the largest number of IT jobs is going to move into developer, programmer, and coder roles. While IT used to be about managing and deploying hardware and software, it’s increasingly going to be about web-based applications that will be expected to work smoothly, be self-evident, and require very little training or intervention from tech support. The other piece of the pie will be mobile applications — both native apps and mobile web apps. As I wrote in my article We’re entering the decade of the developer, the current changes in IT are “shifting more of the power in the tech industry away from those who deploy and support apps to those who build them.” This trend is already underway, and it’s only going to accelerate over the next decade.


Top IT skills wanted for 2012


Nearly 29% of the 353 IT executives polled in Computerworld’s annual Forecast survey said they plan to increase IT staffing through next summer. (That’s up from 23% in the 2010 survey and 20% in the 2009 survey.)

Here are the skills that the IT executives say they will be hiring for:

Programming and Application Development: 61% plan to hire for this skill in the next 12 months, up from 44% in the 2010 survey. This covers the gamut from website development to upgrading internal systems and meeting the needs of mobile users.
Project Management (but with a twist): The twist is that they’re not looking just for people who can oversee and monitor projects. They also want people who can identify users’ needs and translate them for the IT staffers: the increasingly popular business analysts.
Help Desk/Technical Support: Mobile operating systems have added a new dimension to help desk and tech support.
Networking: This demand is being fueled partly by virtualization and cloud computing projects. The survey also revealed that execs will be looking for people with VMware and Citrix experience.
Business Intelligence: Computerworld attributes this uptick to a shift in focus at many companies, from cost savings to investing in technology. That will be nice if it pans out that way.
Data Center: Virtualization and the cloud could also be behind the increased need for IT professionals with backgrounds in data center operations and systems integration.
Web 2.0: Tech skills centered on social media will be in demand, with .Net, AJAX, and PHP as key back-end skills and HTML, XML, CSS, Flash, and JavaScript, among others, on the front end.
Security: Although down from 32% in the 2010 survey, security remains a top concern of IT executives.
Telecommunications: The survey indicates a demand for people with IP telephony skills, and for those familiar with Cisco IPCC call center systems.


Pros and cons of procuring your own IT in the cloud

Continuing my foray into the cloud with Amazon Web Services, I am now in a position to place information on the web for the world to see. I created a new Amazon EC2 machine, carried out some security patching and installed a web server.
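If you would rather do the same thing from code than from the AWS console, here is a minimal sketch using the boto Python library. The AMI ID, key pair and security group names are placeholders I made up, and it assumes your AWS credentials are already configured for boto:

import boto.ec2

# Connect to the EU (Ireland) region used later in this article.
conn = boto.ec2.connect_to_region("eu-west-1")

# Launch one small instance from a placeholder machine image.
reservation = conn.run_instances(
    "ami-12345678",             # hypothetical AMI ID
    key_name="my-keypair",      # hypothetical key pair
    instance_type="t1.micro",
    security_groups=["web"],    # hypothetical group that allows HTTP
)
instance = reservation.instances[0]
print("Launched", instance.id)

From there, the security patching and the web server installation happen on the instance itself, just as described above.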

Now the server is ready to take content that will be available on the Internet. I can do this myself and bypass the traditional workflow in my enterprise. Unfortunately, this proves to be a double-edged sword.

I decide, in a moment of drivelling thickwittedness, to put up a single page with a few helpful details about me and my department. It’s easy enough to create a static page describing the department I work in. I don’t have to worry about high availability, e-commerce or encrypting my data. Of course, this is a trivial example: being able to start a car does not make me a driver. The more complex the business solution, the greater the technical challenges I have to overcome.

My single page can’t do any harm, can it? After all, the server is patched and the site is so simple it is practically unhackable. I don’t have to worry about intruders damaging the company reputation by defacement or other vandalism. Who could I annoy?

The marketing department. All the corporate branding is missing. Even the URL is not right. Who would really believe http://ec2-1-2-3-4.eu-west-1.compute.amazonaws.com/ is part of my enterprise? The marketing guys may accept a URL shortener like bit.ly, but not this.
The HR team. I am posting confidential data. The terms of my employment do not allow this.
The legal guys. I am sending personal data across national boundaries. I have caused regulatory compliance issues for my company.
The security team. They don’t know the first thing about this new service, let alone have they vetted its information security.
Experiments like this must not be linked to the enterprise.

The pros and cons of procuring your own IT

If you are a department manager, be careful. You now face a unique challenge. The world of cloud computing is attracting direct interest from business leaders in a way that hyped technologies of the past failed to do. How many managers have you seen tinkering with blade hardware or ESB software? Any business unit can now instantly procure its own IT, cutting many steps out of the traditional procedure. Anyone can put in the minimal effort required to get to this point.

Let’s say an enterprise chief such as, ooh, say, Dr. Werner Vogels, the CTO of Amazon.com, has decided on a new strategy to empower every department to use cloud computing (and this is what led to the formation of AWS). A department manager who previously relied on the IT department for all that arcane hands-on magic may solve the headache of scarce IT resources by commissioning his own cloud services. The challenge is to achieve the positive effects and avoid the negative.

Pros

Money. The huge long-term investment and short-term maintenance costs are gone.
Time. It takes minutes to create virtual services.
Scale. The vast data centres of the big cloud players (AWS, Rackspace, Verizon, etc.) allow practically unlimited growth.
Cons

Regulatory compliance. The manager puts customer information in the cloud. The provider moves that information around the world and the company fails its regulatory compliance.
Security. The manager puts vulnerable applications in the cloud. Naughty people mount man-in-the-middle, cross-site scripting, and defacement attacks.
Cloud sprawl. All departments love commissioning their own IT solutions. The company ends up with dozens of unconnected e-mail systems, web sites, office apps, data stores and so on.
Repeating the mistakes of the past. Other little gotchas that burnt the IT department in the past will hurt the other departments again.
The safe path through this maze is to follow the direction of the consultants in the IT department. They know these waters; they have decades of experience evaluating services, keeping stakeholders happy, building business tools and so on. But how open is the typical IT department to helping a manager go elsewhere? I fear they will not jump at the opportunity; they will have to be pushed.

Missed a piece?

Follow the entire journey of working in the Amazon Web Services cloud from initial sign-up to building applications and beyond.

2012: The year that mobile tech stood still

I’m coming up on my “New Every Two” renewal, although that program as it used to exist has actually gone away at Verizon. And here’s the interesting thing: even though I have a vague gnawing in the back of my mind about what new gadget I should pick up, and even after having reviewed a half-dozen Android devices and a Windows Phone 7 device this year, I’m just not that excited about the mobile phones currently available. Let’s discuss some of the concerns I have about smartphones in 2012.

First, I really like my Droid 2, and it does most of what I need pretty well. It’s a solid device with great durability and a very good slide-out keyboard. It has quirks, but it’s the devil I know — any future device is the devil I don’t know. I’ve been buying mobile phones since 1987, so I’ve had more than a few cases where my replacement phone left me longing for my previous device.

I’m afraid this speaks to a problem that smartphone manufacturers may run into. While Apple fans are happy with a two-year or even shorter replacement cycle on their expensive mobile gadgets and have an almost irrational need to own the latest and greatest iOS device, I don’t think the rest of the world is ready for such an aggressive upgrade cycle.

To me, it seems like devices are in a kind of purgatory where the release cycle doesn’t offer a compelling enough reason to upgrade — which is why it might be a better choice to wait another 18 months or so. A handful of devices on the market indicate some cool directions where mobile phones may be headed, but they’re not quite delivering on their promises yet.

Eventually, we need seamless integration of components, with industry standards shared across manufacturers. For example, it would be wonderful if a keyboard dock from ASUS were compatible, in both its ports and its functions, with a device from Motorola. A modular, mix-and-match world of device compatibility — that is my dream. But there’s no way we’re going to see that in 2012. Still, some fledgling steps by Motorola are encouraging and make me wonder what kind of devices might hit the market by June or July 2012.

Recent news that the NTSB plans to recommend a comprehensive ban on talking on mobile devices while driving is troubling for the future of smartphones as well. As proposed, this ban exempts in-car devices but not Bluetooth or other hands-free use of regular mobile phones. Depending on how that shakes down, this could be a hot potato in 2012 for smartphones and wireless carriers.

Personally, I have an in-car phone that I rarely use. The voice recognition is fine for hands-free dialing, but there’s no real integration with my digital devices. I buy about 100 minutes a year and generally race to use them up at the end of my OnStar subscription period. This legislation could change all of that, because it makes a specific exemption for using devices installed by the car manufacturer.

If this legislation passes, we may see a move to Android head units that replace factory stereos with in-car hands-free devices, as well as an increasing focus by auto manufacturers on building smart-device functionality into their cars at the factory. This is another place where there’s an outrageous opportunity for convergence.

Imagine an in-car device that hooks up to your factory speakers and offers hands-free, voice-activated, Siri-like functions — and the “detachable faceplate” becomes your mobile smartphone when you exit the car. I should really patent this design immediately, and then sue Apple, Samsung, and Motorola when they inevitably release this device. If this NTSB proposal is adopted as a federal mandate, I’d expect to see devices like this fast-tracked for consumer markets.

For the moment, there really isn’t a revolutionary or magical smartphone shaking things up in the industry. Everything going on is an evolution of current trends, not a quantum leap. When you show me a phone with 24 hours of talk time and two weeks of standby that delivers quad-core processing and is as thin as a RAZR, one that seamlessly docks into a number of accessories (including my in-car stereo head unit) and is built on an open standard, then you’ll have my attention.

How many of you think 2012 is the year when we’ll see such a device released to market? As for me, I’m not holding my breath for much more than the same old, same old in 2012.

Tuesday, December 27, 2011

10 Windows 7 commands every administrator should know

PC troubleshooting is becoming less common in larger organizations, but consultants and techs in smaller shops still have to get their hands dirty identifying and fixing desktop problems. Oftentimes, troubleshooting Windows 7 means delving into the command line. Here are 10 fundamental Windows 7 commands you might find helpful.

Before I begin…

This article is intended solely as an introduction to some useful troubleshooting commands. Many of them offer numerous optional switches, which I won’t cover here due to space limitations. You can find out more about each command by checking out TechNet’s command-line reference.

1: System File Checker

Malicious software will often attempt to replace core system files with modified versions in an effort to take control of the system. The System File Checker can be used to verify the integrity of the Windows system files. If any of the files are found to be missing or corrupt, they will be replaced. You can run the System File Checker by using this command:

sfc /scannow
2: File Signature Verification

One way to verify the integrity of a system is to make sure that all the system files are digitally signed. You can accomplish this with the File Signature Verification tool. This tool is launched from the command line but uses a graphical interface. It will tell you which system files are signed and which aren’t. As a rule, all the system files should be digitally signed, although some hardware vendors don’t sign driver files. The command used to launch the File Signature Verification tool is:

sigverif
3: Driverquery

Incorrect device drivers can lead to any number of system problems. If you want to see which drivers are installed on a Windows 7 system, you can do so by running the driverquery tool. This simple command-line tool provides information about each driver that is being used. The command is:

driverquery
If you need a bit more information, you can append the -v switch. Another option is to append the -si switch, which causes the tool to display signature information for the drivers. Here’s how they look:

driverquery -v
driverquery -si
4: Nslookup

The nslookup tool can help you to verify that DNS name resolution is working correctly. When you run nslookup against a host name, the tool will show you how the name was resolved, as well as which DNS server was used during the lookup. This tool can be extremely helpful when troubleshooting problems related to legacy DNS records that still exist but that are no longer correct.

To use this tool, just enter the nslookup command, followed by the name of the host you want to resolve. For example:

nslookup dc1.contoso.com
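If you need the same lookup from inside a script, here is a minimal Python sketch using the standard library; the host name is the same placeholder used above. Note that it goes through the operating system’s resolver, so unlike nslookup it won’t tell you which DNS server answered:

import socket

# Resolve the placeholder host name from the example above.
name, aliases, addresses = socket.gethostbyname_ex("dc1.contoso.com")
print(name, addresses)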
5: Ping

Ping is probably the simplest of all diagnostic commands. It’s used to verify basic TCP/IP connectivity to a network host. To use it, simply enter the command, followed by the name or IP address of the host you want to test. For example:

ping 192.168.1.1
Keep in mind that this command will work only if Internet Control Message Protocol (ICMP) traffic is allowed to pass between the two machines. If at any point a firewall is blocking ICMP traffic, the ping will fail.
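If you want to script this check, say, to test a whole list of hosts at once, you can wrap the command and test its exit code. Here is a minimal Python sketch, assuming a Windows machine (Windows ping uses -n for the echo count, where Unix versions use -c):

import subprocess

def is_reachable(host, count=2):
    # ping generally exits with code 0 when echo replies come back.
    result = subprocess.run(["ping", "-n", str(count), host],
                            capture_output=True, text=True)
    return result.returncode == 0

print(is_reachable("192.168.1.1"))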

6: Pathping

Ping does a good job of telling you whether two machines can communicate with one another over TCP/IP, but if a ping does fail, you won’t receive any information regarding the nature of the failure. This is where the pathping utility comes in.

Pathping is designed for environments in which one or more routers exist between hosts. It sends a series of packets to each router that’s in the path to the destination host in an effort to determine whether the router is performing slowly or dropping packets. At its simplest, the syntax for pathping is identical to that of the ping command (although there are some optional switches you can use). The command looks like this:

pathping 192.168.1.1
7: Ipconfig

The ipconfig command is used to view and manage a computer’s IP configuration. For example, if you wanted to view a Windows 7 system’s full IP configuration, you could use the following command:

ipconfig /all
Assuming that the system has acquired its IP address from a DHCP server, you can use the ipconfig command to release and then renew the IP address. Doing so involves using the following commands:

ipconfig /release
ipconfig /renew
Another handy thing you can do with ipconfig is flush the DNS resolver cache. This can be helpful when a system is resolving DNS addresses incorrectly. You can flush the DNS cache by using this command:

ipconfig /flushdns
8: Repair-bde

If a drive that is encrypted with BitLocker has problems, you can sometimes recover the data using a utility called repair-bde. To use this command, you will need a destination drive to which the recovered data can be written, as well as your BitLocker recovery key or recovery password. The basic syntax for this command is:

repair-bde <source drive> <destination drive> {-rk <path to recovery key> | -rp <recovery password>}
You must specify the source drive, the destination drive, and either the -rk (recovery key) or the -rp (recovery password) switch, along with the path to the recovery key or the recovery password. Here are two examples of how to use this utility:

repair-bde c: d: -rk e:\recovery.bek
repair-bde c: d: -rp 111111-111111-111111-111111-111111-111111
9: Tasklist

The tasklist command is designed to provide information about the tasks that are running on a Windows 7 system. At its most basic, you can enter the following command:

tasklist
The tasklist command has numerous optional switches, but there are a couple I want to mention. One is the -m switch, which causes tasklist to display all the DLL modules associated with a task. The other is the -svc switch, which lists the services that support each task. Here’s how they look:

tasklist -m
tasklist -svc
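If you want to post-process this information in a script, tasklist can also emit CSV, which is easy to parse. Here is a minimal Python sketch (the -fo and -nh switches select CSV output with no header row):

import csv
import subprocess

# Capture tasklist output in CSV format with no header row.
output = subprocess.run(["tasklist", "-fo", "csv", "-nh"],
                        capture_output=True, text=True).stdout

# Each row starts with the image name and the process ID.
for row in csv.reader(output.splitlines()):
    image_name, pid = row[0], row[1]
    print(pid, image_name)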
10: Taskkill

The taskkill command terminates a task, either by name (which is referred to as the image name) or by process ID. The syntax for this command is simple. You must follow the taskkill command with -pid (process ID) or -im (image name) and the process ID or name of the task that you want to terminate. Here are two examples of how this command works (the process ID and image name are just illustrations):

taskkill -pid 2500
taskkill -im notepad.exe

10 things to love about Windows 8

Now that the Windows 8 Developer Preview has been available for a while, it is easier to take a step back and evaluate it without the powerful emotions that strike most people the first time they deal with it. Looking at it from a long-distance perspective, there’s a lot to like about Windows 8, especially if you are ready to cut the cord from an installed desktop application base and transition to Web applications and Windows 8 native applications. Here are 10 things I think are great about Windows 8.

1: It’s designed for tablets and touch

Microsoft is working hard to make Windows 8 work well with tablets and the touch UI paradigm, to the point of alienating traditional desktop users. It remains to be seen how Microsoft will respond to criticism over the Metro UI. But I can tell you that after using a phone with the Metro UI for well over half a year now, I think it is extremely effective for touch, and I would love to have a tablet running Windows 8.

2: Apps “share” data

One of the big changes in the application development model is that native Windows 8 apps (those using the new Metro UI and WinRT API) really do not communicate with each other directly, even through the file system, except via carefully defined interfaces. While this handcuffs developers a bit, it means that when applications do share data, Windows is aware of how they do it and makes it easy. For example, you could have an application that handles images and use it to share the pictures with, say, an application that uploads them to Facebook. That unleashes a lot more power for developers, because applications from different vendors will work together seamlessly, without the developers having to write anything specific to the other applications theirs will work with.

3: The apps can be integrated into the OS

Just as the applications can “share” with each other, they can do the same thing with Windows itself. Again, this allows some really neat integrations without much work by application makers. A new social networking application can come out, and within weeks Windows will be able to show your friends from it in its contact list or pull its pictures into your picture gallery. The possibilities are endless.

4: It offers ARM support

While the ARM CPUs may not be for everyone or every purpose, lots of mobile vendors have a deep commitment to that platform and understand it well. The ARM devices will not be able to run legacy Windows applications, but they will run the Windows 8 native apps without a hitch. That’s great news for hardware makers, software developers, and users.

5: It beefs up security

The new programming model for Windows 8 native applications is extraordinarily secure. While I am sure that exploits will be found, it will be difficult for the native applications to break free of their chains. Microsoft has really flipped it around. Instead of allowing everything and slowly adding restrictions over the years (and breaking applications in the process, like XP SP2 and Vista did), it’s starting from an “allow nothing” stance.

6: App markets will benefit developers and users

Application markets are nothing new. Even Vista had one (although no one seems to remember it). With Windows 8 native applications, Microsoft is making the application market the primary way of getting apps onto the computer, much like Windows Phone 7. That’s great news for developers who need more visibility for their applications and who don’t want to deal with payment processing and such, especially for low-priced apps. And the application market is great for users, too. As we’ve seen, app markets encourage lower prices, and Microsoft will surely apply the same strict quality control that it has applied to the Windows Phone 7 app market.

7: System restore is easier

Microsoft has built new utilities into Windows 8 that make it easier than ever to send the system back to “out of the box” condition while preserving your data. Providing a more appliance-like experience is critical for the typical user, and the help desk will appreciate it too.

8: Cloud sync is everywhere

While not everyone is in love with the cloud as an idea, Windows 8 has great facilities for allowing applications and users to automatically sync data between devices using the cloud. That’s great for users who can seamlessly transition between their tablet and desktop PC (and perhaps their phone), as well as for tech support, who can just replace a broken device instead of worrying about data loss.

9: It offers simplified administration and configuration

The Control Panel has been stripped down to the bare essentials, and you can’t even think about tasks like registry editing, defragging, etc., from the Metro UI. (You can do these tasks through the legacy desktop, if needed, but that won’t work for ARM devices.) Throughout Windows 8, a primary theme has been giving the user a more appliance-like “It just works” experience. Power users might howl about it, but the truth is, the Windows experience is still far more complex than the average user wants to deal with. Windows 8 is a great move in the right direction for those users.

10: System stability is improved

Windows 7 has really set the standard for system reliability. Short of hardware or driver problems, the old blue screen of death is almost never seen anymore. Windows 8 takes this to the next level. The same changes to the application development model also improve system stability. Applications can’t run over each other’s data easily, and the new WinRT API just does not allow the kinds of shenanigans that have caused unstable systems over the years. If you stick with native Windows 8 applications, reboots (other than for patching) and crashes should be extraordinarily rare.