Wednesday 10 December 2014

SIEM

Security information and event management (SIEM) is an approach to security management that seeks to provide a holistic view of an organization's information technology (IT) security. The acronym is pronounced "sim" with a silent "e."

The underlying principle of a SIEM system is that relevant data about an enterprise's security is produced in multiple locations, and that being able to look at all of that data from a single point of view makes it easier to spot trends and patterns that are out of the ordinary.

SIEM combines SIM (security information management) and SEM (security event management) functions into one security management system. A SEM system centralizes the storage and interpretation of logs and allows near real-time analysis, which enables security personnel to take defensive actions more quickly. A SIM system collects data into a central repository for trend analysis and provides automated reporting.

By bringing these two functions together, SIEM systems provide quicker identification, analysis and recovery of security events. They also allow compliance managers to confirm they are fulfilling an organization's legal compliance requirements. SIEM systems collect logs and other security-related documentation for analysis. Most SIEM systems work by deploying multiple collection agents in a hierarchical manner to gather security-related events from end-user devices, servers, network equipment -- and even specialized security equipment like firewalls, antivirus or intrusion prevention systems. The collectors forward events to a centralized management console, which performs inspections and flags anomalies. To allow the system to identify anomalous events, it’s important that the SIEM administrator first creates a profile of the system under normal event conditions.

At the most basic level, a SIEM system can be rules-based or employ a statistical correlation engine to establish relationships between event log entries. In some systems, pre-processing may happen at edge collectors, with only certain events being passed through to a centralized management node. In this way, the volume of information being communicated and stored can be reduced. The danger of this approach, however, is that relevant events may be filtered out too soon.
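
As a rough illustration of how a rules-based correlation engine and an edge filter fit together, here is a minimal Python sketch. The event fields, thresholds and window size are hypothetical and not taken from any particular SIEM product: the collector forwards only security-relevant event types, and a central rule flags any source that produces too many authentication failures in a short window.

```python
# Minimal sketch of edge filtering plus a rules-based correlation, assuming
# events are plain dicts. Field names and thresholds are hypothetical.
from collections import defaultdict, deque
from datetime import datetime, timedelta

INTERESTING = {"auth_failure", "firewall_deny", "malware_alert"}

def edge_filter(events):
    """Pre-processing at the collector: forward only security-relevant events."""
    return [e for e in events if e["type"] in INTERESTING]

def correlate(events, threshold=5, window=timedelta(minutes=10)):
    """Central rule: flag a source with `threshold` or more auth failures
    inside a sliding time window."""
    recent = defaultdict(deque)
    alerts = []
    for e in sorted(events, key=lambda ev: ev["time"]):
        if e["type"] != "auth_failure":
            continue
        q = recent[e["source"]]
        q.append(e["time"])
        while q and e["time"] - q[0] > window:
            q.popleft()
        if len(q) >= threshold:
            alerts.append((e["source"], e["time"]))
    return alerts

events = [{"type": "auth_failure", "source": "10.0.0.5",
           "time": datetime(2014, 12, 10, 9, 0, i)} for i in range(6)]
print(correlate(edge_filter(events)))   # alerts for 10.0.0.5
```

In a real deployment, the "normal" baseline would come from the profile the administrator builds rather than a hard-coded threshold, and a statistical engine would learn it instead of relying on a fixed rule.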


SIEM systems are typically expensive to deploy and complex to operate and manage. While Payment Card Industry Data Security Standard (PCI DSS) compliance has traditionally driven SIEM adoption in large enterprises, concerns over advanced persistent threats (APTs) have led smaller organizations to look at the benefits a SIEM managed security service provider (MSSP) can offer.

Wednesday 19 November 2014

MOBILE APP DELIVERY

There's no shortage of mobile app delivery approaches to help business users get real work done on their smartphones and tablets.

Virtualization, application refactoring and enterprise app stores are all potential options, but IT pros must consider the cost, complexity and user-friendliness of each before making any decisions. Delivering Windows applications to mobile devices may be easy -- especially in shops that already use desktop or application virtualization -- but it doesn't always make for a great user experience. Native mobile apps are easier to use, but building, buying and deploying them can get tricky.

A good mobile app delivery strategy helps users do their jobs better and eases IT’s management burdens. App stores, Web and cloud apps and virtualization are some of IT’s options.

As more employees bring mobile devices and apps into the workplace, IT has several mobile app delivery and management options to consider.

One of the biggest risks of the consumerization of IT is that mobile device users exchange and store sensitive enterprise data without the necessary oversight. IT can limit these risks by controlling mobile app delivery, management and security.

Desktop and application virtualization are often the first technologies IT pros turn to when they need to deliver legacy software to mobile devices. Virtualization streams Windows applications -- which are designed for mouse-and-keyboard interfaces -- to mobile devices, which have touchscreens. As such, it may not provide the greatest user experience compared to apps that have been developed from the ground up for mobile. Refactoring could emerge as a beneficial middle ground.

There are four approaches worth considering:

1. Mobile app delivery with enterprise app stores

2. Using Web apps for mobile app delivery

3. Mobile app delivery via cloud

4. Mobile desktop virtualization

Monday 13 October 2014

HYBRID CLOUD COMPUTING

What is Hybrid Cloud Computing?

Cloud computing has evolved in recent years. The new world of the hybrid cloud is an environment that employs both private and public cloud services. Companies are realizing that they need many different types of cloud services in order to meet a variety of customer needs.

The growing importance of hybrid cloud environments is transforming the entire computing industry as well as the way businesses are able to leverage technology to innovate. Economics and speed are the two greatest issues driving this market change.

There are two primary deployment models of clouds: public and private. Most organizations will use a combination of private computing resources (data centers and private clouds) and public services, where some of the services in these environments touch each other — this is the hybrid cloud environment.

The public cloud

The public cloud is a set of hardware, networking, storage, services, applications, and interfaces owned and operated by a third party for use by other companies or individuals. These commercial providers create a highly scalable data center that hides the details of the underlying infrastructure from the consumer.

Public clouds are viable because they typically manage relatively repetitive or straightforward workloads. For example, electronic mail is a very simple application. Therefore, a cloud provider can optimize the environment so that it is best suited to support a large number of customers, even if those customers save many messages.

Public cloud providers offering storage or computing services optimize their computing hardware and software to support these specific types of workloads. In contrast, the typical data center supports so many different applications and so many different workloads that it cannot be optimized easily.

The private cloud

A private cloud is a set of hardware, networking, storage, services, applications, and interfaces owned and operated by an organization for the use of its employees, partners, and customers. A private cloud can be created and managed by a third party for the exclusive use of one enterprise.

The private cloud is a highly controlled environment not open for public consumption. Thus, a private cloud sits behind a firewall. The private cloud is highly automated with a focus on governance, security, and compliance.

Automation replaces more manual processes of managing IT services to support customers. In this way, business rules and processes can be implemented inside software so that the environment becomes more predictable and manageable.

The hybrid cloud

A hybrid cloud combines a private cloud with the use of public cloud services, where one or several touch points exist between the environments. The goal is to combine services and data from a variety of cloud models to create a unified, automated, and well-managed computing environment.

Combining public services with private clouds and the data center as a hybrid is the new definition of corporate computing. Not all companies that use some public and some private cloud services have a hybrid cloud. Rather, a hybrid cloud is an environment where the private and public services are used together to create value.

A cloud is hybrid

If a company uses a public development platform that sends data to a private cloud or a data center–based application.

When a company leverages a number of SaaS (Software as a Service) applications and moves data between them and private cloud or data center resources.
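
As a minimal sketch of the first kind of touch point above, code running on a public development platform could push its output to an application sitting behind the company firewall. The endpoint, token and payload below are hypothetical and not any particular provider's API:

```python
# Generic sketch of a hybrid-cloud touch point: a job running in a public
# cloud posts its results to an application hosted in the private data center.
# The URL, token and record fields are hypothetical.
import json
import urllib.request

PRIVATE_ENDPOINT = "https://apps.internal.example.com/orders"  # reachable only through the firewall/VPN
API_TOKEN = "replace-me"  # credential issued by the private side

def push_to_private_cloud(record):
    """Send a record produced in the public cloud to the private application."""
    req = urllib.request.Request(
        PRIVATE_ENDPOINT,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer " + API_TOKEN},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# push_to_private_cloud({"order_id": 42, "status": "processed"})
```

The touch point, rather than the individual services, is what makes the environment hybrid: data and requests flow between the public and private sides under the company's control.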

A cloud is not hybrid

If a few developers in a company use a public cloud service to prototype a new application that is completely disconnected from the private cloud or the data center.

If a company is using a SaaS application for a project but there is no movement of data from that application into the company’s data center.

Wednesday 3 September 2014

Why was the Windows XP lifespan so long?

Strong business customer influence

Corporations and small businesses love predictability. They tend to try to minimize big changes, at least at the operational level, and gladly muddle through with the status quo, particularly if it's good enough to get the job done reliably.

Over the past dozen years, Microsoft did a pretty good job providing timely updates for old reliable XP, while at the same time exploring new user interface (UI) designs and the latest multiprocessor architectures as versions progressed up through Windows 8.

Performance was a hotly contested issue a decade ago, but not so much anymore. Business people buy machines to accomplish work and get paid. Sticking with Windows XP made sense for a lot of small and midsize businesses because the OS matured quickly and updates didn't change the way people worked.

Windows XP also proved to be great for uneventful rollouts. IT techs liked the fact that the OS could run multiple versions of the same application without much worry that one would break another. If, for some reason, a new version of some app died, you'd always have a fallback position.

While Windows XP features didn't change that much, hardware improved rapidly.

New hardware

Through the mid-2000s, hardware made big leaps, quickly moving to 64-bit and multi-core processing. Windows XP ran well on these new machines, taking advantage of the higher clock speeds and larger memories.

Support staffers liked the fact that the underlying design of XP and its UI remained unchanged while becoming more stable over time. Any trouble spots became well-documented and fixes were readily available -- everybody knew how to fix Windows XP problems.

Over the past six or seven years, though, worries about hardware capabilities have gone by the wayside. Most people aren't very impressed with a 3.0 GHz clock speed or having 8 GB of RAM. Corporate and business buyers want stability and a good return on investment, which means buying hardware and keeping it for as long as it can do the job well. Hardware has developed to such a state that it is both fast and reliable.

Half a decade ago, companies realized that they could lengthen the replacement cycle because the quality of the machines always seemed to get better and XP wasn't changing radically.

Sadly, the economy took a nosedive in 2008 and hasn't yet recovered. Companies continue to downsize, budgets have shrunk and everybody has had to tighten their belts. Of course, that situation is all the more reason for many enterprises to put off purchases of the latest hardware and the associated version of Windows.

Easy money

Microsoft realized years ago that not everybody wanted the latest and greatest products all the time. Business users especially wanted reliable, repeatable results without surprises.

So why fix something that isn't broken? And, as long as Windows XP and the fully-depreciated desktop still did the job, why give up a good thing?

In addition, Microsoft Office worked in perfect harmony with the OS, which was another reason for the long Windows XP lifespan. Sure, there have been some upgrades over the years, but if you used Office applications 10 years ago, you can certainly pick them up and use them very effectively today.

What's next?

Companies are now faced with some computing challenges. They can upgrade their hardware and broker deals for newer versions of Windows or hold on to XP, without Microsoft's official support, until the hardware stops working.

Lots of companies are examining tablets and smartphone-based apps as a way to transition away from the traditional fixed desktop system. There's a lot of interest in, as well as fear of, bring-your-own-device practices. The market seems to still be up in the air on how this will play out, though, especially in the areas of standard office-based computing tasks, point-of-sale applications and systems such as automated teller machines.

So, it will be an interesting time ahead as companies move off of XP and onto the newer versions of Windows. Many will successfully evolve over to cloud-connected tablets and smartphones, while some will move to Linux- or Apple-based notebooks.


Even as technology relentlessly moves on, the end of Windows XP support is a milestone as we step into the future.

Sunday 3 August 2014

SMAC

SMAC (social, mobile, analytics and cloud) is the concept that four technologies are currently driving business innovation.

SMAC creates an ecosystem that allows a business to improve its operations and get closer to the customer with minimal overhead and maximum reach. The proliferation of structured and unstructured data that is being created by mobile devices, sensors, social media, loyalty card programs and website browsing is creating new business models built upon customer-generated data. None of the four technologies can be an afterthought because it's the synergy created by social, mobile, analytics and cloud working together that creates a competitive advantage.

Social media has provided businesses with new ways to reach and interact with customers, while mobile technologies have changed the way people communicate, shop and work. Analytics allow businesses to understand how, when and where people consume certain goods and services and cloud computing provides a new way to access technology and the data a business needs to quickly respond to changing markets and solve business problems. While each of the four priorities can impact a business individually, their convergence is proving to be a disruptive force that is creating entirely new business models for service providers.

The integration of social, mobile, analytics and cloud requires clear policies and guidelines as well as management tools that can automate business processes. The media company Netflix is often cited as an example of a business that has successfully harnessed the power of SMAC. For example, when a Netflix member streams a TV show from the Netflix cloud to their iPad, they are given the option of signing into Netflix with Facebook's social login.

After viewing a show, members are given multiple ways to provide social feedback. They can rate content with stars, write reviews and/or share what they just watched with friends on Facebook or Twitter. Customer data is stored in the cloud, and Netflix can break down its analysis to such a granular level that its recommendation engine can personalize suggestions for individual family members who share the same account, a concept known as 1:1 marketing.
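
As a toy illustration of the 1:1 idea (this is not Netflix's actual algorithm; the profiles, titles and ratings below are invented), a recommendation step that works per profile rather than per account might look like this:

```python
# Toy per-profile recommender sketching the 1:1 marketing idea.
# Each profile on a shared account has its own star ratings.
ratings = {
    "dad":  {"House of Cards": 5, "Top Gear": 4},
    "mom":  {"House of Cards": 5, "Orange Is the New Black": 5},
    "teen": {"Top Gear": 5, "Breaking Bad": 5},
}

def similarity(a, b):
    """Crude similarity: how closely two profiles rate the shows they share."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    return sum(1.0 - abs(a[t] - b[t]) / 4.0 for t in shared) / len(shared)

def recommend(profile):
    """Suggest unseen, well-rated titles from the most similar other profile."""
    others = [(similarity(ratings[profile], ratings[p]), p)
              for p in ratings if p != profile]
    _, nearest = max(others)
    return [title for title, stars in ratings[nearest].items()
            if stars >= 4 and title not in ratings[profile]]

print(recommend("dad"))   # ['Orange Is the New Black']
```

A production system would work from far richer viewing and behavioural data held in the cloud, but the principle of tailoring output to the individual profile is the same.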


Proponents of SMAC as a customer relationship management (CRM) strategy believe that 1:1 marketing (also called one-to-one marketing) should be the ultimate goal of every SMAC initiative. Critics worry that 1:1 marketing initiatives that aggregate customer data from disparate sources, especially data that is purchased from data brokers, may violate customer privacy and cause legal problems related to compliance and data sovereignty.

Friday 11 July 2014

ROOTING ANDROID DEVICES

Users root their Android devices because it gives them unlimited access, but system administrators can root devices, too, then analyze everything that's happening on the phone. Rooting Android devices isn't for everyone, however.

Before you even start thinking about rooting Android devices, you should know that no vendor builds devices that are easy to root -- and for good reason. If users have complete access, they may change settings or permissions that cause their devices to stop working. And whether you or your users hack into a device, doing so will void the warranty, the owner will lose support from the manufacturer and, in the worst case, the device could lose functionality completely. Some devices are even programmed to reset themselves to the original operating system upon rebooting, which means that you'll need to root the device over and over again.

To root or not to root?
So should you root Android devices? It depends.
If you need to thoroughly analyze app behavior on devices or build Android apps, it might make sense for you to root one device to test or build the apps before deploying them to all users. In this case, you could root an Android smartphone that you use only for development and testing. If neither of these situations applies to you, however, you're better off letting users' Android devices run their original operating systems.

Rooting requirements

If you do decide to go for it, just know that rooting Android devices requires a ROM that is specifically made for the device you're trying to root. You upload that ROM to the device to replace the default operating system, but there are many different Android devices and OS versions, and even small hardware changes make it so a ROM for a similar device won't work on the exact device you're rooting. If you're planning to root a device so you can build and test applications, make sure to choose a device that has a ROM available.

You should also use a ROM that is from a trusted origin. It's not a good idea to just download any ROM from the Internet, because the ROM maker could have put anything in it, such as backdoors or other security holes. To avoid this problem, use a ROM from a source that is open enough to be verified, such as CyanogenMod.

In addition to getting the right ROM, you'll also need to prepare the device itself. Typically, that involves accessing the Settings menu on the device and selecting the USB debugging option from the Developer options. This allows you to work on the device when it is connected to the computer.

You also need to prepare a computer so that it can see the device. For a Windows computer, that means installing USB drivers for the specific Android device you're trying to root. If you're using a Linux computer, you'll have to create udev rules, but that is a fairly complicated task.

Once the computer is ready, you need to connect the device to the computer via USB, then load the ROM image onto the device. This is the all-or-nothing moment: If the root attempt fails, the phone will be "bricked" and you won't be able to do anything with it anymore. From a corporate perspective, the main problem with this procedure is not the risk of bricking a phone, but the absence of a good source for the ROMs that can be used for rooting.
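
For reference, the typical flow looks something like the sketch below, which drives the standard adb and fastboot command-line tools from Python. The file names are placeholders, the exact steps vary by device and ROM, and most devices also require the bootloader to be unlocked first -- so treat this as an outline rather than a recipe.

```python
# Rough outline of the flashing flow using the adb/fastboot CLI tools.
# File names are placeholders; the exact steps depend on the device and ROM,
# and a failed flash can brick the device.
import subprocess

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def device_connected():
    """Check that adb can actually see the device over USB."""
    out = subprocess.run(["adb", "devices"], capture_output=True, text=True).stdout
    return any(line.strip().endswith("device") for line in out.splitlines()[1:])

def flash_rom(recovery_img="recovery.img", rom_zip="rom.zip"):
    if not device_connected():
        raise SystemExit("No device visible to adb -- check USB debugging, drivers or udev rules")
    run(["adb", "reboot", "bootloader"])                  # drop into the bootloader
    run(["fastboot", "flash", "recovery", recovery_img])  # needs an unlocked bootloader
    run(["fastboot", "reboot"])
    # From the custom recovery, the ROM zip is then typically sideloaded:
    # run(["adb", "sideload", rom_zip])

# flash_rom()
```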


If the process works, the phone will be rooted and you'll get complete access to it. To work in the root environment, you'll need a remote control tool such as the Android Debug Bridge developer tools. Alternatively, you could install an app on the rooted phone -- a shell environment or a full-featured app such as TWRP Manager -- that allows you to work on the device with root access.

Wednesday 11 June 2014

MOBILE INSTANT MESSAGING

Is mobile instant messaging the next big thing in the enterprise? Along with the proliferation of enterprise mobile apps, the answer could be yes.

By 2016, 4.3 billion people will have an email account -- and the same number of people should have an instant messaging (IM) account. Still, it is hard to say that mobile IM will completely overtake mobile email.

Why is enterprise IT concerned about the rise of mobile instant messaging? Any written communication can expose the company to the risk of losing data. Judging by the sheer number of free options for instant messaging, you can see why IT is worried that it cannot control company-related conversations happening on mobile devices.

Who are the mobile IM players?
You can put mobile instant messaging into four categories: third-party (free or mostly free apps), native messaging, enterprise mobility management and company infrastructure tools.

Third-party or free apps consist of apps such as Facebook, Snapchat, Viber and WhatsApp. This category also includes new players that are focused on the enterprise, such as Tigertext and Cotap.

Native apps consist of the native instant messaging that comes with your mobile device, such as BBM for BlackBerry, Messages for Apple and Google Talk for Android.

Enterprise mobility management tools have some of the same characteristics as Tigertext and Cotap in that they encrypt their IM strings and keep the company instant messaging sessions separate from the native device instant messaging sessions.

Company infrastructure tools now have mobile apps to accompany them, too. Tools such as Lync, Yammer and IBM Sametime have apps that support their instant messaging infrastructure. These apps have traditionally offered far less of a usable experience than their desktop counterparts but are getting better with each upgrade.

Do users want mobile instant messaging?
Just like breaking down use of a tablet vs. smartphone vs. laptop, different communication methods have different purposes. If you want a more formal communication that reaches many people, with attachments and a way to let the recipient read the message on their own terms, then email is your method. If you want a quick conversation that warrants a quick response, then IM is your method. Add the fact that Millennials may be 30-50% of your workforce in a few years, and mobile IM may be the standard form of communication in the future.

So would employees of your company even use the enterprise-provided instant messaging option, or would they just use their native messaging platform or an app like Snapchat or Viber?

This is the same argument as the one regarding company-offered productivity apps vs. someone’s personal productivity apps. As an enterprise, if you offer a tool that is easy to use and you provide the proper training, your employees will be more likely to use these tools. Sometimes, instant messaging does not fall under a guideline or policy, nor do users know if they are even using the company-provided tool.

Several of my co-workers have contacts set up in either their company EMM contact list or their native contact list. When they select a name to send an IM, they gravitate toward whatever is easier for them. Usually, it ends up being the native client because they find that more natural.

Do users need an enterprise mobile instant messaging tool? Yes. Will enterprises adopt a tool outside their EMM or infrastructure options? No. The consumerization of IT is still very new for most organizations, and they are focused on managing devices, email and apps. Most organizations will wait for Microsoft and the EMM tools to keep improving their apps to provide an IM tool for their employees. Otherwise, it may take a large breach resulting in data loss to spur adoption of proper tools for enterprise mobile instant messaging.

Thursday 8 May 2014

MOBILE DEVICE MANAGEMENT

What is the market definition or description of mobile device management?

Enterprise mobile device management (MDM) software is:

(1)  A policy and configuration management tool for mobile handheld devices (smartphones and tablets based on smartphone OSs), and

(2)  An enterprise mobile solution for securing and enabling enterprise users and content. It helps enterprises manage the transition to a more complex mobile computing and communications environment by supporting security, network services, and software and hardware management across multiple OS platforms, and now sometimes laptops and ultrabooks as well. This is especially important as bring your own device (BYOD) initiatives and advanced wireless computing become the focus of many enterprises. MDM can support corporate-owned as well as personal devices, and helps support a more complex and heterogeneous environment.

Criteria to consider when choosing an MDM solution:

Internal resources for management — Most MDM purchases are 500 devices or fewer. The size of the company doesn't really matter here as much as the internal resource capabilities to manage devices.

Complexity of data — Gartner's position is that any enterprise data needs to be protected and managed. MDM is a start, by enforcing enterprise policy around encryption and authentication.  Containers should be used to manage email and other mobile content, like file sharing, or enterprise apps, like sales force automation (SFA). These are also delivered by MDM vendors.

Cross-platform needs — More than ever, companies will begin to support multiple OSs. Although today Apple dominates smartphone sales in the enterprise, users will want to bring a variety of other devices to work that MDM providers can manage in an integrated fashion. Once your company has such a diverse environment, MDM becomes a necessity.

Delivery — Companies need to decide whether they want MDM on-premises or in a SaaS/cloud model. SMBs prefer the SaaS model because it reduces cost and total cost of ownership when there are fewer users to justify dedicated hardware. Large companies that are comfortable with the cloud model, usually in non-regulated markets, are also moving toward SaaS. In a global, highly distributed environment, they also like the appeal of the reduction in hardware and server management that cloud brings, versus on-premises servers. MDM managed services are also emerging, but are currently limited in scope and adoption.

Caution/factors before moving onto MDM:

Most companies started out using Exchange ActiveSync (EAS) to manage their devices, but found it lacking in the following areas, which pushed them to purchase a more complete MDM suite:

Volume of devices: It is difficult to manage a larger volume of devices on EAS. Once companies got to more than 500 devices, they typically looked for a more complete MDM suite.

Mix of platforms: Companies that had two or more mobile OS platforms to manage found it difficult to do so on EAS.

Granular support/policy: More complete MDM systems offer a deeper management capability, with more-detailed policies. For example, EAS allows passwords to be enforced (depending on the mobile OS), but more-comprehensive MDM systems allow more flexibility in the password type, length and complexity.

Reporting: EAS is very weak on device reporting. Companies that wanted better reporting moved to more complete MDM systems.

Ability to block certain device platforms: Companies may want to restrict the types of mobile OSs they will support.

Need to identify rooted/jailbroken devices: There is concern over rooted or jailbroken devices because companies cannot control their data if devices are compromised.


Advanced capabilities to manage mobile apps: Application provisioning and updating are important to companies today.

Wednesday 9 April 2014

7Vs OF BIG DATA

Big Data is a big thing. It will change our world completely and is not a passing fad that will go away. To understand the phenomenon that is big data, it is often described using seven Vs: Volume, Velocity, Variety, Veracity, Value, Versatility and Validity.

Volume refers to the vast amounts of data generated every second. Just think of all the emails, Twitter messages, photos, video clips, sensor data etc. we produce and share every second. We are not talking Terabytes but Zettabytes or Brontobytes. On Facebook alone we send 10 billion messages per day, click the "like" button 4.5 billion times and upload 350 million new pictures each and every day. If we take all the data generated in the world between the beginning of time and 2008, the same amount of data will soon be generated every minute! This increasingly makes data sets too large to store and analyse using traditional database technology. With big data technology we can now store and use these data sets with the help of distributed systems, where parts of the data are stored in different locations and brought together by software.
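
The "brought together by software" part is, in essence, the split-the-work-then-combine pattern that distributed frameworks such as Hadoop or Spark automate across many machines. A toy, single-machine sketch of the same idea (with worker threads standing in for separate nodes) looks like this:

```python
# Toy illustration of the distributed idea: each "node" (here just a worker
# thread) summarises its own slice of the data, and a final step combines the
# partial results. Real frameworks do this across clusters of machines.
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def count_words(chunk):
    """Work done locally, where the data slice lives."""
    c = Counter()
    for line in chunk:
        c.update(line.lower().split())
    return c

def distributed_word_count(chunks):
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(count_words, chunks))  # "map" step on each node
    total = Counter()
    for p in partials:                                  # combine ("reduce") step
        total.update(p)
    return total

chunks = [["the cat sat", "the dog ran"], ["the cat ran"]]
print(distributed_word_count(chunks).most_common(2))    # [('the', 3), ('cat', 2)]
```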

Velocity refers to the speed at which new data is generated and the speed at which data moves around. Just think of social media messages going viral in seconds, the speed at which credit card transactions are checked for fraudulent activities, or the milliseconds it takes trading systems to analyse social media networks to pick up signals that trigger decisions to buy or sell shares. Big data technology allows us now to analyse the data while it is being generated, without ever putting it into databases.
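
A tiny sketch of that streaming style of analysis -- flagging suspicious card activity as the transactions arrive, rather than after loading them into a database -- might look like the following; the window size and limit are arbitrary.

```python
# Sketch of analysing a stream as it is generated: a running fraud check on
# card transactions, with nothing written to a database first.
from collections import deque

def flag_bursts(transactions, window_seconds=60, max_in_window=3):
    """Yield transactions whose card has seen too many charges in the window."""
    recent = {}
    for t in transactions:                 # t = (timestamp, card_id, amount)
        ts, card, _ = t
        q = recent.setdefault(card, deque())
        q.append(ts)
        while q and ts - q[0] > window_seconds:
            q.popleft()
        if len(q) > max_in_window:
            yield t

stream = [(i, "card-42", 9.99) for i in range(5)]   # five charges in five seconds
print(list(flag_bursts(stream)))                    # the 4th and 5th charges are flagged
```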

Variety refers to the different types of data we can now use. In the past we focused on structured data that neatly fits into tables or relational databases, such as financial data (e.g. sales by product or region). Today, however, around 80% of the world's data is unstructured and therefore can't easily be put into tables (think of photos, video sequences or social media updates). With big data technology we can now harness different types of data (structured and unstructured), including messages, social media conversations, photos, sensor data, and video or voice recordings, and bring them together with more traditional, structured data.

Veracity refers to the messiness or trustworthiness of the data. With many forms of big data, quality and accuracy are less controllable (just think of Twitter posts with hashtags, abbreviations, typos and colloquial speech, as well as the reliability and accuracy of content), but big data and analytics technology now allows us to work with these types of data. The volumes often make up for the lack of quality or accuracy.

Value: Then there is another V to take into account when looking at Big Data: Value! It is all well and good having access to big data but unless we can turn it into value it is useless. So you can safely argue that 'value' is the most important V of Big Data. It is important that businesses make a business case for any attempt to collect and leverage big data. It is so easy to fall into the buzz trap and embark on big data initiatives without a clear understanding of costs and benefits.

Versatility refers to how flexibly the data can be used and applied to different purposes.

Validity refers to whether the data is correct and accurate for its intended use across the full range of the data set.

Wednesday 5 March 2014

ZERO CLIENT Vs THIN CLIENT

While the term zero client is something of a marketing buzzword, it is a useful way of differentiating options for the devices that are used to access desktops. A zero client is similar to a thin client in its purpose—accessing a desktop in a data center—but requires a lot less configuration.

Zero clients tend to be small and simple devices with a standard set of features that support the majority of users. They also tend to be dedicated to one data center desktop product and remote display protocol. Typically, configuration is simple—a couple of dozen settings at the most, compared to the thousands of settings you see in a desktop operating system. Zero clients load their simple configuration from the network every time they are powered on; the zero clients at a site will all be the same. Zero clients support access to a variety of desktop types: terminal services, virtual desktop infrastructure (VDI) or dedicated rack-mount or blade workstations.

The basic premise of zero clients is that the device on the user’s desk doesn’t have any persistent configuration. Instead, it learns how to provide access to the desktop from the network every time it starts up. This gives a lot of operational benefits, since the zero client devices are never unique. This contrasts with a thin client, which may have local applications installed and will hold its configuration on persistent storage in the device.
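
To make the stateless idea concrete, a zero client could fetch its handful of settings from the network at every power-on and fall back to defaults if nothing is reachable. This is a generic sketch, not any vendor's actual mechanism; the URL and setting names are hypothetical.

```python
# Generic sketch of a zero client pulling its small configuration from the
# network at every boot; nothing persistent is kept on the device itself.
# The URL and setting names are hypothetical, not a real vendor's format.
import json
import urllib.request

CONFIG_URL = "http://config.example.local/zero-client.json"

DEFAULTS = {
    "broker": "vdi.example.local",   # which connection broker to contact
    "protocol": "PCoIP",             # remote display protocol to use
    "resolution": "1920x1080",
}

def load_config():
    try:
        with urllib.request.urlopen(CONFIG_URL, timeout=5) as resp:
            settings = json.load(resp)
    except OSError:
        settings = {}                # network unavailable: run with defaults
    return {**DEFAULTS, **settings}

print(load_config())
```

Because nothing survives a reboot, replacing a failed unit is just a matter of plugging a new one into the same network.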

Thin clients became a mainstream product class shortly after Microsoft introduced Windows Terminal Server and Citrix launched MetaFrame, both in 1998. To enter this market, PC manufacturers cut down their desktop hardware platforms. They repurposed their PC management tools, reusing as much technology as possible from the existing PC business. This meant that a fairly customized Windows or Linux setup could be oriented toward being a thin client.

Over time, optional features for USB redirection, a local Web browser, VoIP integration agents and multi-monitor display support were added. Each additional feature adds configuration and complexity to the thin client. After a few years, thin clients were really small PCs. Some even had PCI or PC Card slots added. These thicker thin clients get quite close to a full PC in terms of capabilities and complexity. Instead of management getting simpler, IT administrators now needed to manage the device on the user's desk as well as in the data center. Zero clients, then, are a return to simpler devices on users' desks—with simpler management.

Zero clients are much simpler to manage, configure and update. Zero client firmware images are a few megabytes, compared with the multiple gigabytes that thin client operating systems take up. The update process itself is much quicker and less intrusive on a zero client, possibly occurring every day when the client boots.

Thin clients need to be patched and updated as often as the desktop operating system they carry; since zero clients have no operating system, they need less frequent updates. Zero clients have few knobs and switches to turn—probably fewer than 100 configuration items in total—so they are simple to manage. Often, their bulk management is a couple of text files on a network share. Thin clients have a whole operating system to manage, with tens of thousands of settings necessitating complex management applications, usually on dedicated servers at multiple sites.

A zero client is like a toaster. A consumer can take it out of its packaging and make it work. If the consumer is an employee at a remote branch, there are benefits to having that worker be able to deploy a new terminal. Sometimes, thin clients need special builds or customized settings applied to them before they are deployed. This obviously is not ideal for rapid deployment. The ability to rapidly scale can be important when it comes to something like opening a call center to accommodate an advertising campaign or a natural disaster response.

Zero clients have lower power consumption. Thin clients have mainstream CPUs and often graphics processing units, but a zero client usually has a low-power CPU (or none at all), which cuts down on power consumption and heat generation.


The simplicity of zero clients also makes for a much smaller attack surface, so placing them in less trusted networks is not so worrying. Also, putting them in physically hostile locations is safe; lower power and usually passive cooling mean that heat, dust and vibration are less likely to cause maintenance problems. Zero clients are all the same. Models are released every couple of years rather than every few months, so your fleet will contain fewer models. Because the devices are interchangeable, there's no need for help desk calls when a device moves from one desk to another, and the user experience is consistent. Your supplier's inventory of zero clients will also have fewer models, which should lead to better availability when you need new zero clients.

Monday 17 February 2014

DATA PRIVACY/PROTECTION

High-profile security failures have made privacy protection a top-of-mind issue for many organisations. In several cases, hackers have gained access to online networks and systems, stealing personal customer data such as names, addresses and passwords. The financial costs of these breaches are often significant, ranging from tens of thousands into the millions. The damage to a company's brand and its reputation often costs far more. When we think of cyber risk we tend to think of security breaches, but when we look at it through a privacy lens, the range of risks broadens significantly.

As IT organizations move toward virtualization, cloud computing and IT-as-a-service, data protection will undergo a fundamental shift. The underpinnings of this transformation include a change from one-size-fits-all backup to a data protection offering that matches service levels with application requirements. IT organizations would be wise to bring in outside help to navigate through this transition.

There are several issues that an outside consultant can help manage, including:

ROI: The business justification of data protection as a service – data protection is still viewed as insurance, and a quality risk assessment and business impact analysis from an outsider can have a meaningful impact with upper management.

Training and Education: Organizations have an opportunity to re-skill staff and gain increased leverage by developing data protection approaches that free up existing personnel. As discussed, however, new approaches will require new mindsets and existing staff will have to be educated and in some cases re-deployed on other tasks.

Architecture: Data protection is not trivial. Virtualization complicates the process and creates I/O storms. Architecting data protection solutions and a services-oriented approach that is efficient and streamlined can be more effectively accomplished with outside help. Don't be afraid to ask.

Customers want choices and ease of access, which requires them to provide personal information and preferences; businesses want to be able to gather, mine and share this information efficiently. Certain industries, such as financial services and healthcare, often draw the most attention in the privacy discussion because of the personal information they possess. However, all industries are affected by privacy and data protection requirements. Confirm that the organisation does not have misplaced or invented reliance on third-party providers that have access to the organisation's own information or that of its customers. Design and implement robust monitoring and testing of privacy and data protection risks and related controls. Most companies have developed and implemented privacy and data protection programs, yet many of these programs fall short for a variety of reasons, including a lack of understanding of the risk landscape related to information collection and transmittal, inadequate organisational policies, insufficient training and unverified third-party providers, among many others.


The bottom line is that data protection is changing from a one-size-fits-all exercise that is viewed as expensive insurance to more of a service-oriented solution that can deliver tangible value to the business by clearly reducing risk at a price that is aligned with business objectives. Understanding data protection in a holistic fashion, from backup, recovery, disaster recovery, archiving and security through to its place in IT-as-a-service, is not only good practice; it can be good for your bottom line.

Monday 20 January 2014

NET NEUTRALITY

What is net neutrality?
Net neutrality is an idea derived from how telephone lines have worked since the beginning of the 20th century. In the case of a telephone line, you can dial any number and connect to it. It does not matter if you are calling from operator A to operator B. It doesn't matter if you are calling a restaurant or a drug dealer. The operators neither block access to a number nor deliberately delay connection to a particular number, unless forced by the law. Most countries have rules that require telecom operators to provide an unfiltered and unrestricted phone service.

When the internet started to take off in the 1980s and 1990s, there were no specific rules requiring internet service providers (ISPs) to follow the same principle. But, mostly because telecom operators were also ISPs, they adhered to the same principle. This principle is known as net neutrality. An ISP does not control the traffic that passes through its servers. When a web user connects to a website or web service, he or she gets the same speed. The data rate for YouTube videos and Facebook photos is theoretically the same. Users can access any legal website or web service without any interference from an ISP.

How did net neutrality shape the internet?

Net neutrality has shaped the internet in two fundamental ways.

One, web users are free to connect to whatever website or service they want. ISPs do not bother with what kind of content is flowing through their servers. This has allowed the internet to grow into a truly global network and has allowed people to freely express themselves. For example, you can criticize your ISP in a blog post, and the ISP will not restrict access to that post for its other subscribers even though the post may harm its business.

But more importantly, net neutrality has enabled a level playing field on the internet. To start a website, you don't need a lot of money or connections. Just host your website and you are good to go. If your service is good, it will find favour with web users. Unlike cable TV, where you have to forge alliances with cable connection providers to make sure that your channel reaches viewers, on the internet you don't have to talk to ISPs to put your website online. This has led to the creation of Google, Facebook, Twitter and countless other services. All of these services had very humble beginnings. They started as basic websites with modest resources. But they succeeded because net neutrality allowed web users to access them in an easy and unhindered way.

What will happen if there is no net neutrality? 

If there is no net neutrality, ISPs will have the power (and inclination) to shape internet traffic so that they can derive extra benefit from it. For example, several ISPs believe that they should be allowed to charge the companies behind services like YouTube and Netflix because these services consume more bandwidth compared to a normal website. Basically, these ISPs want a share in the money that YouTube or Netflix make.

Without net neutrality, the internet as we know it will not exist. Instead of free access, there could be "package plans" for consumers. For example, if you pay Rs 500, you will only be able to access websites based in India. To access international websites, you may have to pay more. Or maybe there could be different connection speeds for different types of content, depending on how much you are paying for the service and what "add-on package" you have bought.

Lack of net neutrality will also spell doom for innovation on the web. It is possible that ISPs will charge web companies to enable faster access to their websites. Those who don't pay may find that their websites open slowly. This means bigger companies like Google will be able to pay more to make access to YouTube or Google+ faster for web users, but a startup that wants to create a different and better video hosting site may not be able to do that.

Will the concept of net neutrality survive?

Net neutrality is a sort of gentlemen's agreement. It has survived so far because few people realized the potential of the internet when it took off around 30 years ago. But now that the internet is an integral part of society and incredibly important, ISPs across the world are trying to get the power to shape and control the traffic. But there are ways to keep net neutrality alive.

Consumers should demand that ISPs keep their hands-off approach to internet traffic. If consumers see a violation of net neutrality, they ought to take a proactive approach and register their displeasure with the ISP. They should also reward ISPs that uphold net neutrality.