Wednesday, 3 September 2014

Why was the Windows XP lifespan so long?

Strong business customer influence

Corporations and small businesses love predictability. They tend to minimize big changes, at least at the operational level, and gladly muddle through with the status quo, particularly if it's good enough to get the job done reliably.

Over the past dozen years, Microsoft did a pretty good job providing timely updates for old reliable XP, while at the same time exploring new user interface (UI) designs and the latest multiprocessor architectures as versions progressed up through Windows 8.

Performance was a hotly contested issue a decade ago, but not so much anymore. Business people buy machines to accomplish work and get paid. Sticking with Windows XP made sense for a lot of small and midsize businesses because the OS matured quickly and updates didn't change the way people worked.

Windows XP also proved to be great for uneventful rollouts. IT techs liked the fact that the OS could run multiple versions of the same application without much worry that one would break another. If, for some reason, a new version of an app failed, you always had a fallback position.

While Windows XP features didn't change that much, hardware improved rapidly.

New hardware

Through the mid-2000s, hardware made big leaps, quickly moving to 64-bit and multi-core processing. Windows XP ran well on these new machines, taking advantage of the higher clock speeds and larger memories.

Support staffers liked the fact that the underlying design of XP and its UI remained unchanged while becoming more stable over time. Any trouble spots became well-documented and fixes were readily available -- everybody knew how to fix Windows XP problems.

Over the past six or seven years, raw hardware capabilities have stopped being a selling point. Most people aren't very impressed with a 3.0 GHz clock speed or 8 GB of RAM. Corporate and business buyers want stability and a good return on investment, which means buying hardware and keeping it for as long as it can do the job well. Hardware has matured to the point of being both fast and reliable.

Half a decade ago, companies realized that they could lengthen the replacement cycle because the quality of the machines always seemed to get better and XP wasn't changing radically.

Sadly, the economy took a nosedive in 2008 and hasn't yet recovered. Companies continue to downsize, budgets have shrunk and everybody has had to tighten their belts. Of course, that situation is all the more reason for many enterprises to put off purchases of the latest hardware and the associated version of Windows.

Easy money

Microsoft realized years ago that not everybody wanted the latest and greatest products all the time. Business users especially wanted reliable, repeatable results without surprises.

So why fix something that isn't broken? And, as long as Windows XP and the fully-depreciated desktop still did the job, why give up a good thing?

In addition, Microsoft Office worked in perfect harmony with the OS, which was another reason for the long Windows XP lifespan. Sure, there have been some upgrades over the years, but if you used Office applications 10 years ago, you can certainly pick them up and use them effectively today.

What's next?

Companies are now faced with some computing challenges. They can upgrade their hardware and broker deals for newer versions of Windows or hold on to XP, without Microsoft's official support, until the hardware stops working.

Lots of companies are examining tablets and smartphone-based apps as a way to transition away from the traditional fixed desktop system. There's a lot of interest in, as well as fear of, bring-your-own-device practices. The market still seems to be up in the air on how this will play out, though, especially in the areas of standard office-based computing tasks, point-of-sale applications and systems such as automated teller machines.

So, it will be an interesting time ahead as companies move off of XP and onto the newer versions of Windows. Many will successfully evolve over to cloud-connected tablets and smartphones, while some will move to Linux- or Apple-based notebooks.


Even as technology relentlessly moves on, the end of Windows XP support is a milestone as we step into the future.

Sunday, 3 August 2014

SMAC

SMAC (social, mobile, analytics and cloud) is the concept that four technologies are currently driving business innovation.

SMAC creates an ecosystem that allows a business to improve its operations and get closer to the customer with minimal overhead and maximum reach. The proliferation of structured and unstructured data that is being created by mobile devices, sensors, social media, loyalty card programs and website browsing is creating new business models built upon customer-generated data. None of the four technologies can be an afterthought because it's the synergy created by social, mobile, analytics and cloud working together that creates a competitive advantage.

Social media has provided businesses with new ways to reach and interact with customers, while mobile technologies have changed the way people communicate, shop and work. Analytics allow businesses to understand how, when and where people consume certain goods and services, and cloud computing provides a new way to access technology and the data a business needs to quickly respond to changing markets and solve business problems. While each of the four priorities can impact a business individually, their convergence is proving to be a disruptive force that is creating entirely new business models for service providers.

The integration of social, mobile, analytics and cloud requires clear policies and guidelines as well as management tools that can automate business processes. The media company Netflix is often cited as an example of a business that has successfully harnessed the power of SMAC. For example, when a Netflix member streams a TV show from the Netflix cloud to their iPad, they are given the option of signing into Netflix with Facebook's social login.

After viewing a show, members are given multiple ways to provide social feedback. They can rate content with stars, write reviews and/or share what they just watched with friends on Facebook or Twitter. Customer data is stored in the cloud, and Netflix can break down its analysis to such a granular level that its recommendation engine can personalize suggestions for individual family members who share the same account, a concept known as 1:1 marketing.
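
To make the idea concrete, here is a minimal Python sketch of the kind of per-profile scoring a recommendation engine might perform. The catalogue, profiles, ratings and genre weights are invented for illustration and are not Netflix's actual data or algorithm.

    # Toy 1:1 recommendation sketch: score the catalogue per profile, not per account.
    # All titles, profiles and weights below are invented for illustration.

    catalogue = {
        "Space Documentary": {"documentary": 1.0, "science": 0.8},
        "Teen Drama":        {"drama": 1.0, "romance": 0.6},
        "Crime Thriller":    {"thriller": 1.0, "drama": 0.4},
    }

    # Each family member sharing the account has their own viewing history,
    # expressed here as average star ratings (1-5) per genre.
    profiles = {
        "parent": {"documentary": 4.5, "thriller": 3.0},
        "teen":   {"drama": 4.0, "romance": 4.5},
    }

    def recommend(profile_ratings, top_n=2):
        """Rank titles by how well their genres match one profile's ratings."""
        scores = {}
        for title, genres in catalogue.items():
            scores[title] = sum(weight * profile_ratings.get(genre, 0.0)
                                for genre, weight in genres.items())
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

    for name, ratings in profiles.items():
        print(name, "->", recommend(ratings))

Run as-is, the parent profile surfaces the documentary first while the teen profile surfaces the drama, which is the essence of personalizing suggestions per family member rather than per account.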


Proponents of SMAC as a customer relationship management (CRM) strategy believe that 1:1 marketing (also called one-to-one marketing) should be the ultimate goal of every SMAC initiative. Critics worry that 1:1 marketing initiatives that aggregate customer data from disparate sources, especially data that is purchased from data brokers, may violate customer privacy and cause legal problems related to compliance and data sovereignty.

Friday, 11 July 2014

ROOTING ANDROID DEVICES

Users root their Android devices because it gives them unlimited access, but system administrators can root devices, too, then analyze everything that's happening on the phone. Rooting Android devices isn't for everyone, however.

Before you even start thinking about rooting Android devices, you should know that no vendor builds devices that are easy to root -- and for good reason. If users have complete access, they may change settings or permissions that cause their devices to stop working. And whether you or your users hack into devices, it will void the warranty on the device, the owner will lose support from the manufacturer, and in the worst case, it could cause the device to lose functionality completely. Some devices are even programmed to reset themselves to the original operating system upon rebooting, which means that you'll need to root the device over and over again.

To root or not to root?
So should you root Android devices? It depends.
If you need to thoroughly analyze app behavior on devices or build Android apps, it might make sense for you to root one device to test or build the apps before deploying them to all users. In this case, you could root an Android smartphone that you use only for development and testing. If neither of these situations applies to you, however, you're better off letting users' Android devices run their original operating systems.

Rooting requirements

If you do decide to go for it, just know that rooting Android devices requires a ROM that is specifically made for the device you're trying to root. You upload that ROM to the device to replace the default operating system, but there are many different Android devices and OS versions, and even small hardware changes can mean that a ROM built for a similar device won't work on the exact device you're rooting. If you're planning to root a device so you can build and test applications, make sure to choose a device that has a ROM available.

You should also use a ROM that is from a trusted origin. It's not a good idea to just download any ROM from the Internet, because the ROM maker could have put anything in it, such as backdoors or other security holes. To avoid this problem, use a ROM from a source that is open enough to be verified, such as CyanogenMod.

In addition to getting the right ROM, you'll also need to prepare the device itself. Typically, that involves accessing the Settings menu on the device and selecting the USB debugging option from the Developer options. This allows you to work on the device when it is connected to the computer.

You also need to prepare a computer so that it can see the device. For a Windows computer, that means installing USB drivers for the specific Android device you're trying to root. If you're using a Linux computer, you'll have to create udev rules, but that is a fairly complicated task.
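
As a rough illustration of the Linux side, the short Python sketch below (run with root privileges) writes a udev rule that lets adb talk to an Android device over USB. The vendor ID 18d1 belongs to Google; your device's vendor may differ, and the file name and permissions shown are common conventions rather than requirements.

    # Minimal sketch: add a udev rule so adb can see an Android device on Linux.
    # Run as root. The vendor ID below (18d1 = Google) is only an example.

    RULE = 'SUBSYSTEM=="usb", ATTR{idVendor}=="18d1", MODE="0666", GROUP="plugdev"\n'

    with open("/etc/udev/rules.d/51-android.rules", "w") as rules_file:
        rules_file.write(RULE)

    # Afterwards, reload the rules and replug the device:
    #   udevadm control --reload-rules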

Once the computer is ready, you need to connect the device to the computer via USB, then load the ROM image onto the device. This is the all-or-nothing moment: If the root attempt fails, the phone will be "bricked" and you won't be able to do anything with it anymore. From a corporate perspective, the main problem with this procedure is not the risk of bricking a phone, but the absence of a good source for the ROMs that can be used for rooting.
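
Before taking that all-or-nothing step, it's worth confirming that the computer really does see the device. Here is a minimal Python pre-flight check; it only assumes the standard adb command-line tool from the Android SDK is on the PATH.

    # Pre-flight check before flashing: make sure adb sees exactly one device.
    import subprocess
    import sys

    output = subprocess.run(["adb", "devices"],
                            capture_output=True, text=True).stdout
    # Output looks like: "List of devices attached" followed by "serial<TAB>device" lines.
    devices = [line.split("\t")[0] for line in output.splitlines()[1:]
               if line.strip().endswith("device")]

    if len(devices) != 1:
        sys.exit("Expected exactly one connected device, found: %r" % devices)

    print("Ready to work with device", devices[0])
    # The next step is typically rebooting into the bootloader or recovery,
    # for example with: adb reboot bootloader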


If the process works, the phone will be rooted and you'll get complete access to it. To work in the root environment, you'll need a remote control tool such as the Android Debug Bridge developer tools. Alternatively, you could install an app on the rooted phone -- a shell environment or a full-featured app such as TWRP Manager -- that allows you to work on the device with root access.
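
As a small, hedged illustration of working in that root environment through the Android Debug Bridge, the Python sketch below asks the device for its current user ID. It assumes adb is on the PATH and that a su binary was installed during rooting; on a successfully rooted phone the command should report uid 0 (root).

    # Check whether a connected device grants root access over adb.
    import subprocess

    result = subprocess.run(["adb", "shell", "su", "-c", "id"],
                            capture_output=True, text=True)

    if "uid=0(root)" in result.stdout:
        print("Device is rooted:", result.stdout.strip())
    else:
        print("No root access:", (result.stdout + result.stderr).strip())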

Wednesday, 11 June 2014

MOBILE INSTANT MESSAGING

Is mobile instant messaging the next big thing in the enterprise? Along with the proliferation of enterprise mobile apps, the answer could be yes.

By 2016, 4.3 billion people will have an email account -- and the same number of people should have an instant messaging (IM) account. Still, it is hard to say whether mobile IM will completely take over from mobile email.

Why is enterprise IT concerned about the rise of mobile instant messaging? Any written communication can expose the company to the risk of losing data. Judging by the sheer number of free options for instant messaging, you can see why IT worries that it cannot control company-related conversations happening on mobile devices.

Who are the mobile IM players?
You can put mobile instant messaging into four categories: third-party (free or mostly free apps), native messaging, enterprise mobility management and company infrastructure tools.

Third-party or free apps consist of apps such as Facebook, Snapchat, Viber and WhatsApp. This category also consists of new players that are focused on the enterprise, such as Tigertext and Cotap.

Native apps consist of the instant messaging clients that come with your mobile device, such as BBM for BlackBerry, Messages for Apple and Google Talk for Android.

Enterprise mobility management tools have some of the same characteristics as Tigertext and Cotap in that they encrypt their IM strings and keep the company instant messaging sessions separate from the native device instant messaging sessions.

Company infrastructure tools now have mobile apps to accompany them, too. Tools such as Lync, Yammer and IBM Sametime have apps that support their instant messaging infrastructure. These apps have traditionally offered a far less usable experience than their desktop counterparts but are getting better with each upgrade.

Do users want mobile instant messaging?
Just as with tablet vs. smartphone vs. laptop use, different communication methods have different purposes. If you want a more formal communication that reaches many people, with attachments and a way to let recipients read the message on their own terms, then email is your method. If you want a quick conversation that warrants a quick response, then IM is your method. Add the fact that Millennials may be 30-50% of your workforce in a few years, and mobile IM may be the standard form of communication in the future.

So would employees of your company even use the enterprise-provided instant messaging option, or would they just use their native messaging platform or an app like Snapchat or Viber?

This is the same argument as the one regarding company-offered productivity apps vs. someone's personal productivity apps. As an enterprise, if you offer a tool that is easy to use and you provide the proper training, your employees will be more likely to use it. Sometimes, instant messaging does not fall under any guideline or policy, and users don't always know whether they are even using the company-provided tool.

Several of my co-workers have contacts set up in either their company EMM contact list or their native contact list. When they select a name to send an IM, they gravitate toward whatever is easier for them. Usually, it ends up being the native client because they find that more natural.

Do users need an enterprise mobile instant messaging tool? Yes. Will enterprises adopt a tool outside their EMM or infrastructure options? No. The consumerization of IT is still very new for most organizations, and they are focused on managing devices, email and apps. Most organizations will wait for Microsoft and the EMM vendors to keep improving their apps before providing an IM tool to their employees. Otherwise, it may take a large breach resulting in data loss to spur the adoption of proper enterprise mobile instant messaging tools.

Thursday, 8 May 2014

MOBILE DEVICE MANAGEMENT

What is the market definition or description of mobile device management?

Enterprise mobile device management (MDM) software is:

(1)  A policy and configuration management tool for mobile handheld devices (smartphones and tablets based on smartphone OSs), and

(2)  An enterprise mobile solution for securing and enabling enterprise users and content. It helps enterprises manage the transition to a more complex mobile computing and communications environment by supporting security, network services, and software and hardware management across multiple OS platforms, and now sometimes laptops and ultrabooks as well. This is especially important as bring your own device (BYOD) initiatives and advanced wireless computing become the focus of many enterprises. MDM can support corporate-owned as well as personal devices, and helps support a more complex and heterogeneous environment.

Criteria to consider when choosing an MDM solution:

Internal resources for management — Most MDM purchases are 500 devices or fewer. The size of the company doesn't really matter here as much as the internal resource capabilities to manage devices.

Complexity of data — Gartner's position is that any enterprise data needs to be protected and managed. MDM is a start, by enforcing enterprise policy around encryption and authentication.  Containers should be used to manage email and other mobile content, like file sharing, or enterprise apps, like sales force automation (SFA). These are also delivered by MDM vendors.

Cross-platform needs — More than ever, companies will begin to support multiple OSs. Although today Apple dominates smartphone sales in the enterprise, users will want to bring a variety of other devices to work that MDM providers can manage in an integrated fashion. Once your company has such a diverse environment, MDM becomes a necessity.

Delivery — Companies need to decide whether they want MDM on-premises or in a SaaS/cloud model. SMBs prefer the SaaS model because it reduces cost and total cost of ownership when there are too few users to justify dedicated hardware. Large companies that are comfortable with the cloud model, usually in non-regulated markets, are also moving toward SaaS. In a global, highly distributed environment, they also like the reduction in hardware and server management that cloud brings compared with on-premises servers. MDM managed services are also emerging, but are currently limited in scope and adoption.

Caution/factors before moving onto MDM:

Most companies started out using Exchange ActiveSync (EAS) to manage their devices, but found it lacking in the following areas, which pushed them to purchase a more complete MDM suite:

Volume of devices: It is difficult to manage a larger volume of devices on EAS. Once companies got to more than 500 devices, they typically looked for a more complete MDM suite.

Mix of platforms: Companies that had two or more mobile OS platforms to manage found it difficult to do so on EAS.

Granular support/policy: More complete MDM systems offer a deeper management capability, with more-detailed policies. For example, EAS allows passwords to be enforced (depending on the mobile OS), but more-comprehensive MDM systems allow more flexibility in the password type, length and complexity (a small illustrative policy check is sketched after this list).

Reporting: EAS is very weak on device reporting. Companies that wanted better reporting moved to more complete MDM systems.

Ability to block certain device platforms: Companies may want to restrict the types of mobile OSs they will support.

Need to identify rooted/jailbroken devices: There is concern over rooted or jailbroken devices because companies cannot control their data if devices are compromised.


Advanced capabilities to manage mobile apps: Application provisioning and updating are important to companies today.
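
To make "granular policy" concrete, here is a minimal Python sketch of the kind of password-complexity check an MDM agent might enforce on a device. The specific rules and values are invented for illustration and will differ between products and the policies a given console pushes out.

    import re

    # Example policy an MDM console might push to devices (values are illustrative).
    POLICY = {"min_length": 8, "require_digit": True, "require_symbol": True}

    def password_complies(password, policy=POLICY):
        """Return True if the password satisfies the pushed policy."""
        if len(password) < policy["min_length"]:
            return False
        if policy["require_digit"] and not re.search(r"\d", password):
            return False
        if policy["require_symbol"] and not re.search(r"[^A-Za-z0-9]", password):
            return False
        return True

    print(password_complies("letmein"))      # False: too short, no digit or symbol
    print(password_complies("S3cure!Pass"))  # True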

Wednesday, 9 April 2014

7Vs OF BIG DATA

Big Data is a big thing. It will change our world completely and is not a passing fad that will go away. To understand the phenomenon that is big data, it is often described using seven Vs: Volume, Velocity, Variety, Veracity, Value, Versatility and Validity.

Volume refers to the vast amounts of data generated every second. Just think of all the emails, Twitter messages, photos, video clips, sensor data etc. we produce and share every second. We are not talking terabytes but zettabytes or brontobytes. On Facebook alone we send 10 billion messages per day, click the "like" button 4.5 billion times and upload 350 million new pictures each and every day. If we take all the data generated in the world between the beginning of time and 2008, the same amount of data will soon be generated every minute! This increasingly makes data sets too large to store and analyse using traditional database technology. With big data technology we can now store and use these data sets with the help of distributed systems, where parts of the data are stored in different locations and brought together by software.
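
The idea of splitting one data set across many machines can be sketched very simply: a hash of each record's key decides which node stores it. The Python toy below illustrates only that partitioning idea; the node names and keys are made up, and real distributed stores are far more sophisticated.

    # Toy illustration of spreading records across storage nodes by key hash.
    import hashlib

    NODES = ["node-a", "node-b", "node-c"]

    def node_for(key):
        """Map a record key to one of the storage nodes."""
        digest = hashlib.md5(key.encode()).hexdigest()
        return NODES[int(digest, 16) % len(NODES)]

    for record_key in ["user:1001", "user:1002", "photo:42"]:
        print(record_key, "->", node_for(record_key))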

Velocity refers to the speed at which new data is generated and the speed at which data moves around. Just think of social media messages going viral in seconds, the speed at which credit card transactions are checked for fraudulent activities, or the milliseconds it takes trading systems to analyse social media networks to pick up signals that trigger decisions to buy or sell shares. Big data technology allows us now to analyse the data while it is being generated, without ever putting it into databases.
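
Analysing data "while it is being generated" essentially means processing each record as it arrives rather than loading everything into a database first. The Python sketch below shows that streaming style in miniature; the transaction feed and the fraud threshold are invented purely for illustration.

    # Toy streaming check: flag transactions as they arrive, with no database step.

    def transaction_stream():
        """Stand-in for an endless feed of incoming card transactions."""
        yield {"card": "1234", "amount": 25.0}
        yield {"card": "1234", "amount": 9400.0}
        yield {"card": "5678", "amount": 12.5}

    def flag_suspicious(stream, threshold=5000.0):
        for tx in stream:                      # records are processed one at a time
            if tx["amount"] > threshold:
                yield tx

    for alert in flag_suspicious(transaction_stream()):
        print("Possible fraud:", alert)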

Variety refers to the different types of data we can now use. In the past we focused on structured data that neatly fits into tables or relational databases, such as financial data (e.g. sales by product or region). In fact, 80% of the world's data is now unstructured, and therefore can't easily be put into tables (think of photos, video sequences or social media updates). With big data technology we can now harness different types of data (structured and unstructured) including messages, social media conversations, photos, sensor data, video or voice recordings and bring them together with more traditional, structured data.

Veracity refers to the messiness or trustworthiness of the data. With many forms of big data, quality and accuracy are less controllable (just think of Twitter posts with hashtags, abbreviations, typos and colloquial speech, as well as the reliability and accuracy of content), but big data and analytics technology now allows us to work with these types of data. The volumes often make up for the lack of quality or accuracy.

Value: Then there is another V to take into account when looking at Big Data: Value! It is all well and good having access to big data but unless we can turn it into value it is useless. So you can safely argue that 'value' is the most important V of Big Data. It is important that businesses make a business case for any attempt to collect and leverage big data. It is so easy to fall into the buzz trap and embark on big data initiatives without a clear understanding of costs and benefits.

Versatility: This refers to the different ways the same data can be used and the range of capabilities it supports.

Validity: This refers to whether the data is correct and accurate for its intended use, covering both the range the data spans and the ability to verify it.

Wednesday, 5 March 2014

ZERO CLIENT Vs THIN CLIENT

While the term zero client is something of a marketing buzzword, it is a useful way of differentiating options for the devices that are used to access desktops. A zero client is similar to a thin client in its purpose—accessing a desktop in a data center—but requires a lot less configuration.

Zero clients tend to be small and simple devices with a standard set of features that support the majority of users. They also tend to be dedicated to one data center desktop product and remote display protocol. Typically, configuration is simple—a couple of dozen settings at the most, compared to the thousands of settings you see in a desktop operating system. Zero clients load their simple configuration from the network every time they are powered on; the zero clients at a site will all be the same. Zero clients support access to a variety of desktop types: terminal services, virtual desktop infrastructure (VDI), or dedicated rack-mount or blade workstations.

The basic premise of zero clients is that the device on the user’s desk doesn’t have any persistent configuration. Instead, it learns how to provide access to the desktop from the network every time it starts up. This gives a lot of operational benefits, since the zero client devices are never unique. This contrasts with a thin client, which may have local applications installed and will hold its configuration on persistent storage in the device.
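
Conceptually, a zero client's boot sequence amounts to fetching a tiny configuration from the network and connecting with it. The Python sketch below illustrates that idea only; the URL, field names and protocol values are hypothetical, not any vendor's actual format.

    # Conceptual sketch of a zero-client boot: no local state, fetch config, connect.
    import json
    from urllib.request import urlopen

    def fetch_config(url):
        """On a real device this would be downloaded at every power-on."""
        return json.load(urlopen(url))

    def connect(config):
        print("Connecting to", config["broker"], "using", config["protocol"])

    if __name__ == "__main__":
        # Example of the kind of configuration a zero client might download at boot
        # (a hypothetical broker address and display protocol).
        sample = {"broker": "vdi.example.local", "protocol": "PCoIP", "usb": False}
        connect(sample)

Because nothing in that configuration lives on the device itself, swapping or replacing a unit changes nothing for the user, which is exactly the operational benefit described above.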

Thin clients became a mainstream product class shortly after Microsoft introduced Windows Terminal Server and Citrix launched MetaFrame, both in 1998. To enter this market, PC manufacturers cut down their desktop hardware platforms. They repurposed their PC management tools, reusing as much technology as possible from their existing PC business. This meant that a fairly customized Windows or Linux setup could be oriented toward being a thin client.

Over time, optional features for USB redirection, a local Web browser, VoIP integration agents and multi-monitor display support were added. Each additional feature added configuration and complexity to the thin client. After a few years, thin clients were really just small PCs. Some even had PCI or PC Card slots added. These thicker thin clients got quite close to a full PC in terms of capabilities and complexity. Instead of simplifying management, IT administrators now needed to manage the device on the user's desk as well as in the data center. Zero clients, then, are a return to simpler devices on users' desks—with simpler management.

Zero clients are much simpler to manage, configure and update. Zero client firmware images are a few megabytes, compared with the multiple gigabytes that thin client operating systems take up. The update process itself is much quicker and less intrusive on a zero client, possibly occurring every day when the client boots.

Thin clients need to be patched and updated as often as the desktop operating system they carry; since zero clients have no operating system, they need less frequent updates. Zero clients have few knobs and switches to turn—probably fewer than 100 configuration items in total—so they are simple to manage. Often, their bulk management is a couple of text files on a network share. Thin clients have a whole operating system to manage, with tens of thousands of settings necessitating complex management applications, usually on dedicated servers at multiple sites.

A zero client is like a toaster. A consumer can take it out of its packaging and make it work. If the consumer is an employee at a remote branch, there are benefits to having that worker be able to deploy a new terminal. Sometimes, thin clients need special builds or customized settings applied to them before they are deployed. This obviously is not ideal for rapid deployment. The ability to rapidly scale can be important when it comes to something like opening a call center to accommodate an advertising campaign or a natural disaster response.

Zero clients have lower power consumption. Thin clients have mainstream CPUs and often graphics processing units, but a zero client usually has a low-power CPU (or none at all), which cuts down on power consumption and heat generation.


The simplicity of zero clients also makes for a much smaller attack surface, so placing them in less trusted networks is not so worrying. Also, putting them in physically hostile locations is safe; lower power and usually passive cooling mean that heat, dust and vibration are less likely to cause maintenance problems.

Zero clients are all the same. Models are released every couple of years rather than every few months, so your fleet will contain fewer models. That means there's no need for help desk calls to move a device from one desk to another. Plus the user experience is consistent. Your supplier's inventory of zero clients will also have fewer models, which should lead to better availability when you need new zero clients.