Thursday 19 February 2015

5 BIG TECH TRENDS IN 2015

1. The Internet of Me. This means that our world will be personalized. Everyday objects are going online, creating an abundance of digital channels that reach deep into our lives. Forward-thinking businesses are changing the ways they build new applications, products, and services to engage customers without breaching the customers’ trust.

2. The Outcome Economy. Intelligent hardware will bridge the digital enterprise and the physical world. As sensors of all kinds spread through the physical world, products will be valued for the results they produce for people. Business is no longer about selling things but about selling results: the outcome economy.

3. Platform (R)evolution. Platform-based companies are capturing more of the digital economy’s opportunities for growth and profits. The cloud and mobility are eroding the tech and cost barriers associated with such platforms.

4. Intelligent Enterprise. Technology will enable people to make faster decisions. With an influx of big data, software intelligence will make it easier for machines to make better-informed decisions.

5. Workforce Reimagined. Humans and machines must amplify one another to accomplish more together. Advances in natural interfaces, wearable devices, and smart machines will present new opportunities for companies to empower their workers through technology.

Wednesday 10 December 2014

SIEM

Security information and event management (SIEM) is an approach to security management that seeks to provide a holistic view of an organization's information technology (IT) security. The acronym is pronounced "sim" with a silent "e."

The underlying principle of a SIEM system is that relevant data about an enterprise's security is produced in multiple locations, and that being able to view all of that data from a single vantage point makes it easier to spot trends and patterns that are out of the ordinary.

SIEM combines SIM (security information management) and SEM (security event management) functions into one security management system. A SEM system centralizes the storage and interpretation of logs and allows near real-time analysis which enables security personnel to take defensive actions more quickly. A SIM system collects data into a central repository for trend analysis and provides automated reporting.

By bringing these two functions together, SIEM systems provide quicker identification, analysis and recovery of security events. They also allow compliance managers to confirm they are fulfilling an organization's legal compliance requirements.

SIEM systems collect logs and other security-related documentation for analysis. Most SIEM systems work by deploying multiple collection agents in a hierarchical manner to gather security-related events from end-user devices, servers, network equipment -- and even specialized security equipment like firewalls, antivirus or intrusion prevention systems. The collectors forward events to a centralized management console, which performs inspections and flags anomalies. For the system to identify anomalous events, the SIEM administrator must first create a profile of the system under normal event conditions.
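
To make the idea of a baseline profile concrete, here is a minimal sketch of one way such profiling could work: record each source's normal event rate, then flag sources whose current rate deviates sharply from that norm. The event sources, numbers and three-sigma threshold are assumptions for illustration, not the engine of any particular SIEM product.

```python
from statistics import mean, stdev

# Hypothetical baseline: events-per-hour counts observed per source
# during a known-good training window.
baseline = {
    "web-server-01": [120, 115, 130, 125, 118],
    "db-server-01":  [40, 42, 38, 45, 41],
}

def is_anomalous(source, current_rate, sigmas=3):
    """Flag a source whose current event rate deviates more than
    `sigmas` standard deviations from its recorded baseline."""
    history = baseline.get(source)
    if not history:
        return True  # unknown sources are suspicious by default
    mu, sd = mean(history), stdev(history)
    return abs(current_rate - mu) > sigmas * max(sd, 1)

print(is_anomalous("web-server-01", 122))  # False: within the normal range
print(is_anomalous("web-server-01", 900))  # True: a sharp spike worth flagging
```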

At the most basic level, a SIEM system can be rules-based or employ a statistical correlation engine to establish relationships between event log entries. In some systems, pre-processing may happen at edge collectors, with only certain events being passed through to a centralized management node. In this way, the volume of information being communicated and stored can be reduced. The danger of this approach, however, is that relevant events may be filtered out too soon.
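
As a rough illustration of rules-based correlation, the sketch below implements one classic rule: alert when a single IP address produces several failed logins inside a short window. The event format, field names and thresholds are invented for the example; real SIEM rule languages are far richer.

```python
from collections import defaultdict

# Hypothetical, simplified log entries: (timestamp_seconds, event_type, source_ip)
events = [
    (100, "login_failed", "10.0.0.5"),
    (130, "login_failed", "10.0.0.5"),
    (150, "login_failed", "10.0.0.5"),
    (160, "login_success", "10.0.0.5"),
    (400, "login_failed", "10.0.0.9"),
]

def correlate(events, threshold=3, window=120):
    """Rule: `threshold` or more failed logins from one IP within `window` seconds."""
    failures = defaultdict(list)
    alerts = []
    for ts, etype, ip in sorted(events):
        if etype != "login_failed":
            continue
        # Keep only the failures that fall inside the sliding window.
        failures[ip] = [t for t in failures[ip] if ts - t <= window] + [ts]
        if len(failures[ip]) >= threshold:
            alerts.append((ip, ts))
    return alerts

print(correlate(events))  # [('10.0.0.5', 150)] -- a possible brute-force attempt
```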

SIEM systems are typically expensive to deploy and complex to operate and manage. While Payment Card Industry Data Security Standard (PCI DSS) compliance has traditionally driven SIEM adoption in large enterprises, concerns over advanced persistent threats (APTs) have led smaller organizations to look at the benefits a SIEM managed security service provider (MSSP) can offer.

Wednesday 19 November 2014

MOBILE APP DELIVERY

There's no shortage of mobile app delivery approaches to help business users get real work done on their smartphones and tablets.

Virtualization, application refactoring and enterprise app stores are all potential options, but IT pros must consider the cost, complexity and user-friendliness of each before making any decisions. Delivering Windows applications to mobile devices may be easy -- especially in shops that already use desktop or application virtualization -- but it doesn't always make for a great user experience. Native mobile apps are easier to use, but building, buying and deploying them can get tricky.

A good mobile app delivery strategy helps users do their jobs better and eases IT’s management burdens. App stores, Web and cloud apps and virtualization are some of IT’s options.

As more employees bring mobile devices and apps into the workplace, IT has several mobile app delivery and management options to consider.

One of the biggest risks of the consumerization of IT is that mobile device users exchange and store sensitive enterprise data without the necessary oversight. IT can limit these risks by controlling mobile app delivery, management and security.

Desktop and application virtualization are often the first technologies IT pros turn to when they need to deliver legacy software to mobile devices. Virtualization streams Windows applications -- which are designed for mouse-and-keyboard interfaces -- to mobile devices, which have touchscreens. As such, it may not provide the greatest user experience compared to apps that have been developed from the ground up for mobile. Refactoring could emerge as a beneficial middle ground.

There are four approaches worth considering:

1. Mobile app delivery with enterprise app stores

2. Using Web apps for mobile app delivery

3. Mobile app delivery via cloud

4. Mobile desktop virtualization

Monday 13 October 2014

HYBRID CLOUD COMPUTING

What is Hybrid Cloud Computing?

Cloud computing has evolved in recent years. The new world of the hybrid cloud is an environment that employs both private and public cloud services. Companies are realizing that they need many different types of cloud services in order to meet a variety of customer needs.

The growing importance of hybrid cloud environments is transforming the entire computing industry as well as the way businesses are able to leverage technology to innovate. Economics and speed are the two greatest issues driving this market change.

There are two primary cloud deployment models: public and private. Most organizations will use a combination of private computing resources (data centers and private clouds) and public services, where some of the services in these environments touch each other -- this is the hybrid cloud environment.

The public cloud

The public cloud is a set of hardware, networking, storage, services, applications, and interfaces owned and operated by a third party for use by other companies or individuals. These commercial providers create a highly scalable data center that hides the details of the underlying infrastructure from the consumer.

Public clouds are viable because they typically handle relatively repetitive or straightforward workloads. Electronic mail, for example, is a very simple application, so a cloud provider can optimize the environment to support a large number of customers, even when those customers store many messages.

Public cloud providers offering storage or computing services optimize their computing hardware and software to support these specific types of workloads. In contrast, the typical data center supports so many different applications and so many different workloads that it cannot be optimized easily.

The private cloud

A private cloud is a set of hardware, networking, storage, services, applications, and interfaces owned and operated by an organization for the use of its employees, partners, and customers. A private cloud can be created and managed by a third party for the exclusive use of one enterprise.

The private cloud is a highly controlled environment not open for public consumption. Thus, a private cloud sits behind a firewall. The private cloud is highly automated with a focus on governance, security, and compliance.

Automation replaces more manual processes of managing IT services to support customers. In this way, business rules and processes can be implemented inside software so that the environment becomes more predictable and manageable.

The hybrid cloud

A hybrid cloud is a private cloud combined with the use of public cloud services, where one or several touch points exist between the environments. The goal is to combine services and data from a variety of cloud models to create a unified, automated, and well-managed computing environment.

Combining public services with private clouds and the data center as a hybrid is the new definition of corporate computing. Not all companies that use some public and some private cloud services have a hybrid cloud. Rather, a hybrid cloud is an environment where the private and public services are used together to create value.

A cloud is hybrid

If a company uses a public development platform that sends data to a private cloud or a data center–based application.

If a company leverages a number of SaaS (Software as a Service) applications and moves data between private or data center resources.

A cloud is not hybrid

If a few developers in a company use a public cloud service to prototype a new application that is completely disconnected from the private cloud or the data center.

If a company is using a SaaS application for a project but there is no movement of data from that application into the company’s data center.
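
One way to picture the "touch point" that makes a cloud hybrid is as the code that moves or routes data between the two environments. The sketch below is purely illustrative: the store classes and the sensitivity rule stand in for real public- and private-cloud APIs, which differ per provider.

```python
# Illustrative only: these classes stand in for real cloud storage APIs.
class PublicCloudStore:
    def save(self, record):
        print(f"stored in public cloud: {record['id']}")

class PrivateCloudStore:
    def save(self, record):
        print(f"stored in private data center: {record['id']}")

def route(record, public=PublicCloudStore(), private=PrivateCloudStore()):
    """The integration point between environments -- the touch point
    that makes the overall deployment a hybrid cloud."""
    if record.get("sensitive"):
        private.save(record)  # governed, firewalled environment
    else:
        public.save(record)   # scalable commodity environment

route({"id": "order-42", "sensitive": False})  # goes to the public cloud
route({"id": "patient-7", "sensitive": True})  # stays in the private data center
```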

Wednesday 3 September 2014

Why was the Windows XP lifespan so long?

Strong business customer influence

Corporations and small businesses love predictability. They tend to minimize big changes, at least at the operational level, and gladly muddle through with the status quo, particularly if it's good enough to get the job done reliably.

Over the past dozen years, Microsoft did a pretty good job providing timely updates for old reliable XP, while at the same time exploring new user interface (UI) designs and the latest multiprocessor architectures as versions progressed up through Windows 8.

Performance was a hotly contested issue a decade ago, but not so much anymore. Business people buy machines to accomplish work and get paid. Sticking with Windows XP made sense for a lot of small and midsize businesses because the OS matured quickly and updates didn't change the way people worked.

Windows XP also proved to be great for uneventful rollouts. IT techs liked the fact that the OS could run multiple versions of the same application without much worry that one would break another. If a new version of some app died, you'd always have a fallback position.

While Windows XP features didn't change that much, hardware rapidly improved.

New hardware

Through the mid-2000s, hardware made big leaps, quickly moving to 64-bit and multi-core processing. Windows XP ran well on these new machines, taking advantage of the higher clock speeds and larger amounts of memory.

Support staffers liked the fact that the underlying design of XP and its UI remained unchanged while becoming more stable over time. Any trouble spots became well-documented and fixes were readily available -- everybody knew how to fix Windows XP problems.

Over the past six or seven years, though, raw hardware gains have stopped turning heads. Most people aren't very impressed with a 3.0 GHz clock speed or 8 GB of RAM. Corporate buyers want stability and a good return on investment, which means buying hardware and keeping it for as long as it can do the job well. Hardware has developed to the point of being both fast and reliable.

Half a decade ago, companies realized that they could lengthen the replacement cycle because the quality of the machines always seemed to get better and XP wasn't changing radically.

Sadly, the economy took a nosedive in 2008 and hasn't yet recovered. Companies continue to downsize, budgets have shrunk and everybody has had to tighten their belts. Of course, that situation is all the more reason for many enterprises to put off purchases of the latest hardware and the associated version of Windows.

Easy money

Microsoft realized years ago that not everybody wanted the latest and greatest products all the time. Business users especially wanted reliable, repeatable results without surprises.

So why fix something that isn't broken? And, as long as Windows XP and the fully-depreciated desktop still did the job, why give up a good thing?

In addition, Microsoft Office worked in perfect harmony with the OS, which was another reason for the long Windows XP lifespan. Sure, there have been some upgrades over the years, but if you used Office applications 10 years ago, you can certainly pick them up and use them effectively today.

What's next?

Companies are now faced with some computing challenges. They can upgrade their hardware and broker deals for newer versions of Windows or hold on to XP, without Microsoft's official support, until the hardware stops working.

Lots of companies are examining tablets and smartphone-based apps as a way to transition away from the traditional fixed desktop system. There's a lot of interest in, as well as fear of, bring-your-own-device practices. The market still seems to be up in the air on how this will play out, though, especially in the areas of standard office-based computing tasks, point-of-sale applications and systems such as automated teller machines.

So, interesting times lie ahead as companies move off of XP and onto newer versions of Windows. Many will successfully migrate to cloud-connected tablets and smartphones, while some will move to Linux- or Apple-based notebooks.

Even as technology relentlessly moves on, the end of Windows XP support is a milestone as we step into the future.

Sunday 3 August 2014

SMAC

SMAC (social, mobile, analytics and cloud) is the concept that four technologies are currently driving business innovation.

SMAC creates an ecosystem that allows a business to improve its operations and get closer to the customer with minimal overhead and maximum reach. The proliferation of structured and unstructured data that is being created by mobile devices, sensors, social media, loyalty card programs and website browsing is creating new business models built upon customer-generated data. None of the four technologies can be an afterthought because it's the synergy created by social, mobile, analytics and cloud working together that creates a competitive advantage.

Social media has provided businesses with new ways to reach and interact with customers, while mobile technologies have changed the way people communicate, shop and work. Analytics allow businesses to understand how, when and where people consume certain goods and services, and cloud computing provides a new way to access technology and the data a business needs to quickly respond to changing markets and solve business problems. While each of the four technologies can impact a business individually, their convergence is proving to be a disruptive force that is creating entirely new business models for service providers.

The integration of social, mobile, analytics and cloud requires clear policies and guidelines as well as management tools that can automate business processes. The media company Netflix is often cited as an example of a business that has successfully harnessed the power of SMAC. For example, when a Netflix member streams a TV show from the Netflix cloud to their iPad, they are given the option of signing into Netflix with Facebook's social login.

After viewing a show, members are given multiple ways to provide social feedback. They can rate content with stars, write reviews and/or share what they just watched with friends on Facebook or Twitter. Customer data is stored in the cloud and Netflix can break down its analysis to such a granular level that its recommendation engine can personalize suggestions for individual family members who share the same account, a concept known as 1:1 marketing.
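
As a toy illustration of that kind of per-profile personalization (not Netflix's actual system, whose algorithms are proprietary), the sketch below ranks candidate titles by how well their genres overlap with one profile's viewing history; all names and data are invented.

```python
# Toy 1:1 recommendation: all profiles, titles and scoring are invented.
profile_history = {"alice": ["drama", "crime", "drama"],
                   "bob":   ["animation", "comedy"]}

catalog = {"Crime Doc":    {"crime", "drama"},
           "Cartoon Hour": {"animation", "comedy"},
           "Space Opera":  {"sci-fi", "drama"}}

def recommend(member):
    """Rank titles by overlap between their genres and the member's history."""
    tastes = set(profile_history[member])
    scores = {title: len(genres & tastes) for title, genres in catalog.items()}
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice"))  # ['Crime Doc', 'Space Opera', 'Cartoon Hour']
```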

Proponents of SMAC as a customer relationship management (CRM) strategy believe that 1:1 marketing (also called one-to-one marketing) should be the ultimate goal of every SMAC initiative. Critics worry that 1:1 marketing initiatives that aggregate customer data from disparate sources, especially data that is purchased from data brokers, may violate customer privacy and cause legal problems related to compliance and data sovereignty.

Friday 11 July 2014

ROOTING ANDROID DEVICES

Users root their Android devices because it gives them unlimited access, but system administrators can root devices, too, then analyze everything that's happening on the phone. Rooting Android devices isn't for everyone, however.

Before you even start thinking about rooting Android devices, you should know that no vendor builds devices that are easy to root -- and for good reason. If users have complete access, they may change settings or permissions that cause their devices to stop working. Whether you or your users hack into a device, rooting will void its warranty, the owner will lose support from the manufacturer, and in the worst case, the device could lose functionality completely. Some devices are even programmed to reset themselves to the original operating system upon rebooting, which means you'll need to root the device over and over again.

To root or not to root?

So should you root Android devices? It depends.

If you need to thoroughly analyze app behavior on devices or build Android apps, it might make sense for you to root one device to test or build the apps before deploying them to all users. In this case, you could root an Android smartphone that you use only for development and testing. If neither of these situations applies to you, however, you're better off letting users' Android devices run their original operating systems.

Rooting requirements

If you do decide to go for it, know that rooting an Android device requires a ROM made specifically for the device you're trying to root. You upload that ROM to the device to replace the default operating system, but there are many different Android devices and OS versions, and even small hardware changes mean that a ROM built for a similar device won't work on the exact device you're rooting. If you're planning to root a device so you can build and test applications, make sure to choose a device that has a ROM available.

You should also use a ROM that is from a trusted origin. It's not a good idea to just download any ROM from the Internet, because the ROM maker could have put anything in it, such as backdoors or other security holes. To avoid this problem, use a ROM from a source that is open enough to be verified, such as CyanogenMod.
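
Whatever the source, it's worth verifying that the image you downloaded is the one the maintainers published. Below is a minimal sketch of that check, assuming the ROM project publishes a SHA-256 checksum alongside the image; the file name and checksum here are placeholders.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Compute a file's SHA-256 digest without loading it all into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Placeholder values -- substitute the real file and published checksum.
expected = "0123abcd..."  # from the ROM project's download page
actual = sha256_of("example-rom.zip")
print("OK" if actual == expected else "MISMATCH -- do not flash this image")
```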

In addition to getting the right ROM, you'll also need to prepare the device itself. Typically, that involves accessing the Settings menu on the device and selecting the USB debugging option from the Developer options. This allows you to work on the device when it is connected to the computer.

You also need to prepare a computer so that it can see the device. For a Windows computer, that means installing USB drivers for the specific Android device you're trying to root. If you're using a Linux computer, you'll have to create udev rules, but that is a fairly complicated task.
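
As a sketch of what those udev rules involve: you add a rules file under /etc/udev/rules.d/ that matches your device's USB vendor ID. The vendor ID below (18d1) is Google's; other manufacturers use different IDs, so check yours with lsusb and adjust accordingly.

```
# /etc/udev/rules.d/51-android.rules
# Grant non-root users access to a Google-vendor Android device over USB.
SUBSYSTEM=="usb", ATTR{idVendor}=="18d1", MODE="0666", GROUP="plugdev"
```

After saving the file, reload the rules with udevadm control --reload-rules (or simply replug the device) for the change to take effect.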

Once the computer is ready, you need to connect the device to the computer via USB, then load the ROM image onto the device. This is the all-or-nothing moment: If the root attempt fails, the phone will be "bricked" and you won't be able to do anything with it anymore. From a corporate perspective, the main problem with this procedure is not the risk of bricking a phone, but the absence of a good source for the ROMs that can be used for rooting.

If the process works, the phone will be rooted and you'll get complete access to it. To work in the root environment, you'll need a remote control tool such as the Android Debug Bridge developer tools. Alternatively, you could install an app on the rooted phone -- a shell environment or a full-featured app such as TWRP Manager -- that allows you to work on the device with root access.
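
As a minimal sketch of driving a device over the Android Debug Bridge from a desktop (assuming adb is installed and on the PATH), the snippet below lists the devices the computer can see and probes for root with su; the probe succeeds only if the device actually grants root access.

```python
import subprocess

def adb(*args):
    """Run an adb command and return its stdout as text."""
    result = subprocess.run(["adb", *args], capture_output=True, text=True)
    return result.stdout

print(adb("devices"))  # devices visible to this computer over USB
output = adb("shell", "su", "-c", "id")
print("rooted" if "uid=0" in output else "not rooted (or su was denied)")
```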