Sunday, 30 September 2012

Big Data & The Cloud

A good presentation by Joe Ziegler at the 5th Elephant conference in Bangalore.

Watch it to learn WHY the cloud is Big Data's BEST FRIEND!

Eucalyptus - An Overview of On-premise IaaS and AWS

Presentation from the AWS Worldwide Public Sector team's conference, Building and Securing Applications in the Cloud.

Friday, 28 September 2012

Amazon RDS Now Supports SQL Server 2012

Want to try SQL Server 2012? There's no need to invest in hardware and software: Amazon RDS now supports SQL Server 2012 with an easy-to-use interface at very affordable prices.

With added support for Microsoft SQL Server 2012, Amazon RDS customers can use the new features Microsoft has introduced as part of SQL Server 2012 including improvements to manageability, performance, programmability, and security.

Read on for an extract from Jeff's blog post on the announcement:
The Amazon Relational Database Service (RDS) now supports SQL Server 2012. You can now launch the Express, Web, and Standard Editions of this powerful database from the comfort of the AWS Management Console. SQL Server 2008 R2 is still available, as are multiple versions and editions of MySQL and Oracle Database.

If you are from the Microsoft world and haven't heard of RDS, here's the executive summary:

You can run the latest and greatest offering from Microsoft in a fully managed environment. RDS will install and patch the database, make backups, and detect and recover from failures. It will also provide you with a point-and-click environment to make it easy for you to scale your compute resources up and down as needed.

What's New?

SQL Server 2012 supports a number of new features including contained databases, columnstore indexes, sequences, and user-defined roles:

  • A contained database is isolated from other SQL Server databases including system databases such as "master." This isolation removes dependencies and simplifies the task of moving databases from one instance of SQL Server to another.
  • Columnstore indexes are used for data warehouse style queries. Used properly, they can greatly reduce memory consumption and I/O requests for large queries.
  • Sequences are counters that can be used in more than one table.
  • The new user-defined role management system allows users to create custom server roles.

Read the SQL Server What's New documentation to learn more about these and other features.

You can launch SQL Server 2012 from the AWS Management Console. First you select the edition that best meets your needs:

Then you fill in the details (SQL Server 2012 is version 11), and your DB Instance will be launched in a matter of minutes:

Yes, This is Cool!

You can now get started with SQL Server 2012 without having to invest in hardware or buy a license. If you are eligible for the AWS Free Usage Tier, you can get started without spending a penny. You can launch a DB Instance, evaluate the product, do a trial migration of your data, and learn all about the new features at minimal cost. When the time comes to move your organization to SQL Server 2012, you'll already have experience using it in a real-world environment.

For more information on what’s new in SQL Server 2012, please visit Microsoft’s SQL Server 2012 MSDN documentation.

To learn more about using RDS for SQL Server 2012, please visit the Amazon RDS for SQL Server detail page, AWS documentation and FAQs.
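For readers who prefer the command line to the console flow above, the RDS command line tools of this era can create the same instance. The flag spellings and the "sqlserver-ex" engine identifier below are assumptions based on the RDS CLI tools and may differ by tool version, so the command is composed and printed here rather than executed; treat it as a sketch, not a definitive recipe:

```shell
# Compose (but do not execute) an RDS launch command for SQL Server 2012
# Express Edition. The engine name "sqlserver-ex" and the flag names are
# assumptions; check the help output of your installed
# rds-create-db-instance tool before running anything.
DB_ID="sql2012-trial"
LAUNCH_CMD="rds-create-db-instance $DB_ID \
  --engine sqlserver-ex \
  --db-instance-class db.t1.micro \
  --allocated-storage 20 \
  --master-username master \
  --master-user-password YOUR_PASSWORD_HERE"
echo "$LAUNCH_CMD"
```

Once you are happy with the parameters, run the composed command with real credentials and watch the instance appear in the RDS console.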

Get Started With the vCloud Service Evaluation Beta Today!

The Wait Is Over – Get Started With the vCloud Service Evaluation Beta Today!

Good news – the waitlist for the vCloud Service Evaluation Beta has been removed! This means that users can now sign up today and get a public vCloud account in 15 minutes or less.

Announced last month, the vCloud Service Evaluation provides a quick, easy and low-cost way for you to learn about the advantages of a vCloud through hands-on testing and experimentation. All you need to sign up is a credit card, and you can get your own public vCloud up and running in minutes.

vmware vCloud service evaluation beta

The vCloud Service Evaluation has all the basics you need, including a catalog of useful VM templates, virtual networking, persistent storage, external IP addresses, firewalls, load balancers, the vCloud API, and more. A variety of pre-built content templates are also available (at no charge) through the vCloud Service Evaluation, including WordPress, Joomla!, Sugar CRM, LAMP stack, Windows Server, etc.

For a limited time, you can also use the promo code “VMworld50” for a $50 credit towards your vCloud environment.

Looking for support? Technical How-To Guides are perfect for new vCloud users looking for implementation assistance.


In addition, signing up for the vCloud Service Evaluation gives you access to the vCloud Service Evaluation Community, where users can ask questions and get answers directly from others in the vCloud community.


Your own vCloud is just a few clicks away – sign up for the vCloud Service Evaluation Beta (don’t forget to use the promo code, “VMworld50”) and set up your own vCloud today!

Re-blogged from VMware Blogs.

Installing AWS Command Line Tools from Amazon Downloads

A very well-put-together blog post on installing AWS Command Line Tools from Amazon downloads, by Eric Hammond. Some useful extracts from the post follow.

When you need an AWS command line toolset not provided by Ubuntu packages, you can download the tools directly from Amazon and install them locally. Unfortunately, Amazon does not have one single place where you can download all the command line tools for the various services, nor are all of the tools installed in the same way, nor do they all use the same format for accessing the AWS credentials.

The following steps show how to install and configure the AWS command line tools provided by Amazon [...]


Install required software packages:

sudo apt-get update
sudo apt-get install -y openjdk-6-jre ruby1.8-full libxml2-utils unzip cpanminus build-essential

Create a directory where all AWS tools will be installed:
sudo mkdir -p /usr/local/aws

Now we’re ready to start downloading and installing all of the individual software bundles that Amazon has released and made available in scattered places on their web site and various S3 buckets.
Download and Install AWS Command Line Tools

These steps should be done from an empty temporary directory so you can afterwards clean up all of the downloaded and unpacked files.

Note: Some of these download URLs always get the latest version and some tools have different URLs every time a new version is released. Click through on the tool link to find the latest [Download] URL.

EC2 API command line tools (the download URL is omitted here; get the current one from the tool page as noted above):
wget --quiet [EC2 API tools download URL]
unzip -qq ec2-api-tools-*.zip
sudo rsync -a --no-o --no-g ec2-api-tools-*/ /usr/local/aws/ec2/

EC2 AMI command line tools (again, get the current download URL from the tool page):
wget --quiet [EC2 AMI tools download URL]
unzip -qq ec2-ami-tools-*.zip
sudo rsync -a --no-o --no-g ec2-ami-tools-*/ /usr/local/aws/ec2/
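After unpacking, the EC2 tools need a few environment variables before they will run. The install path matches the directory used above; the JAVA_HOME value is an assumption for Ubuntu's openjdk-6-jre package and may differ on your system:

```shell
# Point the EC2 tools at a JRE and at their install directory.
# JAVA_HOME is an assumed path for Ubuntu's openjdk-6-jre package;
# verify it on your machine (e.g. with: readlink -f "$(which java)").
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk-amd64/jre
export EC2_HOME=/usr/local/aws/ec2
export PATH="$PATH:$EC2_HOME/bin"
```

Add these lines to ~/.bashrc to make them persistent; afterwards, running ec2-describe-regions is a quick way to confirm the tools and credentials are wired up.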

Thursday, 27 September 2012

Elastic Detector : Elastic Vulnerability Assessment


SecludIT has developed a new approach to vulnerability assessment that uses the elasticity of IaaS: Elastic Vulnerability Assessment (EVA).
Elastic Detector is SecludIT's fully automated security event detection tool for Amazon EC2.
It helps administrators and users of Amazon EC2-based infrastructures continuously identify holes in security groups and applications, dramatically reducing the risk of external and internal attacks.
It is delivered as SaaS or as a virtual appliance (currently running only in the US East Region).
In contrast to existing tools, you don't need to install any additional software (such as agents) or configure any monitors up front.
If you want to know more about Elastic Detector, watch the video below or try the service for free.

AWS Announcement : High Performance Provisioned IOPS Storage For Amazon RDS

After recently announcing the EBS Provisioned IOPS offering, which allows you to specify both volume size and volume performance in terms of the number of I/O operations per second (IOPS), AWS has now announced High Performance Provisioned IOPS Storage for Amazon RDS.
You can now create an RDS database instance and specify your desired level of IOPS in order to get more consistent throughput and performance.

Amazon RDS Provisioned IOPS is immediately available for new database instances in the US East (N. Virginia), US West (N. California), and EU West (Ireland) Regions, and AWS plans to launch in other AWS Regions in the coming months.
AWS is rolling this out in two phases. Read on for an extract from Jeff's announcement on the AWS Blog:
We are rolling this out in two stages. Here's the plan:
  • Effective immediately, you can provision new RDS database instances with 1,000 to 10,000 IOPS, and with 100GB to 1 TB of storage for MySQL and Oracle databases. If you are using SQL Server, the maximum IOPS you can provision is 7,000 IOPS. All other RDS features including Multi-AZ, Read Replicas, and the Virtual Private Cloud, are also supported.
  • In the near future, we plan to provide you with an automated way to migrate existing database instances to Provisioned IOPS storage for the MySQL and Oracle database engines. If you want to migrate an existing database instance to Provisioned IOPS storage immediately, you can export your data and re-import it into a new database instance equipped with Provisioned IOPS storage.

We expect database instances with RDS Provisioned IOPS to be used in demanding situations. For example, they are a perfect host for I/O-intensive transactional (OLTP) workloads.
We recommend that customers running production database workloads use Amazon RDS Provisioned IOPS for the best possible performance. (By the way, for mission critical OLTP workloads, you should also consider adding the Amazon RDS Multi-AZ option to improve availability.)

Check out the video with Rahul Pathak of the Amazon RDS team to learn more about this new feature and how some AWS customers are using it:

Responses from AWS customers:

  • AWS customer Flipboard uses RDS to deliver billions of page flips each month to millions of mobile phone and tablet users. Sang Chi, Data Infrastructure Architect at Flipboard told us:
"We want to provide the best possible reading and content delivery experience for a rapidly growing base of users and publishers. This requires us not only to use a high performance database today but also to continue to improve our performance in the future. Throughput consistency is critical for our workloads. Based on results from our early testing, we are very excited about Amazon RDS Provisioned IOPS and the impact it will have on our ability to scale. We’re looking forward to scaling our database applications to tens of thousands of IOPS and achieving consistent throughput to improve the experience for our users."
  • AWS customer Shine Technologies uses RDS for Oracle to build complex solutions for enterprise customers. Adam Kierce, their Director, said:
"Amazon RDS Provisioned IOPS provided a turbo-boost to our enterprise class database-backed applications. In the past, we have invested hundreds of days in time consuming and costly code based performance tuning, but with Amazon RDS Provisioned IOPS we were able to exceed those performance gains in a single day. We have demanding clients in the Energy, Telecommunication, Finance and Retail industries, and we fully expect to move all our Oracle backed products onto AWS using Amazon RDS for Oracle over the next 12 months. The increased performance of Amazon's RDS for Oracle with Provision IOPS is an absolute game changer, because it delivers more (performance) for less (cost)."

Wednesday, 26 September 2012

AWS Week in Review - September 17th to September 23rd, 2012


Let's take a quick look at what happened in AWS-land last week:

Monday, September 17

§   AWS added four new checks to the AWS Trusted Advisor including EC2 Reserved Instance Optimization, VPN Tunnel Redundancy, RDS Backup, and RDS Multi-AZ.

Tuesday, September 18

§   AWS introduced Auto Scaling termination policies to give you additional control over the scale-down process.

Friday, September 21

§ Developer Andy Chilton released version 0.11.0 of Awssum, a collection of node.js modules for AWS.


Infographic: Young Professionals & Risky Tech Behavior

Based on the 2011 Cisco Connected World Technology Report, BACKGROUNDCHECK.ORG has compiled an infographic taking a look at the risky tech-related behavior young professionals engage in (in terms of device and password management, extreme internet behavior, and lost or stolen devices).

Young Professionals & Risky Tech Behavior


Infographic : Hypervisor Tug-of-War

We've described the battle between VMware and Microsoft as a hypervisor war going back as far as the first release of Microsoft Virtual Server. Unfortunately for Microsoft, that war was pretty ugly. On the one side, VMware had guns and cannons, and on the other side, Microsoft was throwing rocks. Ok, maybe it wasn't quite that bad, but you get the picture.
Fast forward and Microsoft has added a lot of weaponry with the introduction of Microsoft Hyper-V, the company's true hypervisor. And the Redmond giant's offering just keeps on getting better. But, so too does the ESX hypervisor from VMware. A virtual tug-of-war if you will.
Check out this latest infographic from the folks at SolarWinds.
Instead of depicting this as a hypervisor war, the infographic calls it the Hypervisor Tug-of-War.
The survey information clearly shows that VMware remains in control. However, there are some interesting data points in there as well.
  • Survey responders believe that VMware is far and away going to be the most trusted private cloud vendor, with a whopping 76.9%
  • However 76.1% believe Microsoft will close the functionality gap with VMware thanks to Hyper-V 3.0
  • On the "VM stall" front, 56.8% say key applications will be deployed on a private cloud next year, while 57.7% have already virtualized more than 40% of their datacenter
  • 70% of end users still deploy more than one virtualization tool
And at the bottom, we see that VMware still commands a lead in the hypervisor tug-of-war.
Reblogged from VMBlog
Infographic by SolarWinds

Monday, 24 September 2012

Best Practices For Companies In The Cloud



Infographic: The Big Data Boom

Our age has moved into a complex situation. On the one hand, we are experiencing ever-increasing adoption of home clouds, while on the other, commodity data services are hungrily intensifying. In the midst of this chaos, big data and cloud computing have secured their bond, and their friendship has come to the fore. This has prompted a sigh of relief from the user group.

Data is a common problem in organizations, and hence they want to figure out how to get their enterprise data assets under control. The friendship mantra of big data and the cloud can be verified by analyzing the emergence of cloud computing with the processing power to manage exabytes of information. Being able to handle large amounts of information is a priority for big enterprises in the industry because organizations are trying to get their data under control. Nevertheless, learning from the great success of big data in the cloud world, many governments have also become active players in the cloud domain.

Emerging big data trends show that organizations are getting to the analytical stages of their processes, gaining the ability to determine value, and getting to know what their data is doing at a particular point in their business. This also requires the combination of huge amounts of data into common, accessible points that provide mission-critical business intelligence (BI). Data warehousing and the ability to look at the value of information, either in an operational state or in a decision-support state, are also factors of prime importance.

Nearly all the latest market trends point to the notion of being agile and being customer responsive. Interestingly, not all big data cloud servers are the same. The technology that Microsoft provides is completely different from the technology that Google provides. The time it takes to push a big data project to completion inevitably depends on the technology used by the server. This leads us to the question of which service provider to choose and which to ignore.

Big data is one of the critical elements in supporting an adaptive organization. The ability of cloud computing to provision compute resources, storage, network capacity and, above all, do all this magic at a moment's notice makes big data and the cloud the dearest of friends. There is a long way to go, but for now they seem to be friends for life.

Infographic source: Netuitive
Write-up source: Big Data And Cloud Computing – Friends For Life

Friday, 21 September 2012

Pros and Cons of Hybrid Cloud

Benefits and Limitations of investing in a Hybrid Cloud

Pros and Cons of Hybrid cloud


Pros and Cons of Private Cloud

Benefits and Limitations of investing in a Private Cloud

Pros and Cons of Private Cloud


Pros and Cons of Public Cloud

Benefits and Limitations of investing in a Public Cloud


Pros and Cons of Public Cloud


Infographic: America's Most Innovative Colleges

In late June 2012, the winners of the 2012 Innovators Awards were revealed; the awards are given out to IT leaders who have come up with extraordinary technology solutions for campus challenges.

The information below has been compiled into an infographic to showcase the projects and technologies used by the winning colleges.

Innovations on Campus

Put Together By:

Wednesday, 19 September 2012

AWS Week in Review - September 10th to September 16th, 2012


Let's take a quick look at what happened in AWS-land last week:
Tuesday, September 11
Wednesday, September 12
Thursday, September 13


Sign in to and Use the AWS Account

In this guide, let's cover the topics below:
  • Understand the AWS Management Console.
  • Different ways of logging in to the AWS Management Console.
  • Sign in to the AWS Management Console using the account-level credentials.
  • Configure the AWS Management Console to suit your needs.

AWS Management Console

AWS's own simple definition of the AWS Management Console:
Access and manage Amazon’s growing suite of infrastructure web services through a simple and intuitive, web-based user interface. The AWS Management Console provides convenient management of your compute, storage, and other cloud resources.

AWS constantly keeps pushing new features and support for various services into the console. If a feature is not available through the AWS Management Console, the user must employ the APIs and/or SDKs provided by AWS.

In AWS, there are basically two different ways for a user to sign in to the AWS Management Console for handling the services:
  • Using the Account level credentials
Consider this a "POWER USER LOGIN" (a term coined by me to set perspective, not by AWS).
A user can sign in using the typical AWS Console login URL. The user must use the email address with which the account was created and the password provided.
This way of signing in gives the user complete control over AWS services, resources, and account management.
In this guide we will be concentrating more on Account-level login to the AWS Management Console.  
  • Amazon Identity and Access Management (IAM) User
Consider this an "ACCESS-CONTROLLED USER" (a term coined by me to set perspective, not by AWS).
If more than one user needs to log in to the AWS account, you can use the IAM service. Each user may have the same or different access controls over the various AWS services and resources. The users can sign in to the console using a different alias, specific to your account, with a user-specific login name and password. These privileges are not only applicable to the AWS Management Console; the same can be applied to the use of SDKs and APIs. This can be achieved by creating user-specific Access Keys and Secret Keys.
IAM also enables identity federation between your corporate directory and AWS services.
I'm NOT covering IAM user login in this tutorial, but I will surely write a guide on the topic and provide updated links.
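Many of the Amazon command line tools can pick up an IAM user's keys from a credential file referenced by the AWS_CREDENTIAL_FILE environment variable. The file name below is my own choice, and the key values are AWS's documented example placeholders; substitute the real access key and secret key generated for your IAM user:

```shell
# Write a credential file for the AWS command line tools and point
# AWS_CREDENTIAL_FILE at it. The two values are AWS's documented
# example placeholders, not real keys; paste in the IAM user's keys.
cat > "$HOME/.aws-credentials.txt" <<'EOF'
AWSAccessKeyId=AKIAIOSFODNN7EXAMPLE
AWSSecretKey=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
EOF
chmod 600 "$HOME/.aws-credentials.txt"   # keep the secret key private
export AWS_CREDENTIAL_FILE="$HOME/.aws-credentials.txt"
```

Keeping per-user credential files like this means each IAM user's access can be revoked independently, without touching the account-level email and password.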

Sign in to the AWS Management Console using the account-level credentials:

Tuesday, 18 September 2012

Amazon VPC - New Additions

AWS has added 3 new features / options to the  Amazon Virtual Private Cloud (VPC) service.
Please find below extracts from the two blog posts written by Jeff on these features:
The Amazon Virtual Private Cloud (VPC) gives you the power to create a private, isolated section of the AWS Cloud. You have full control of network addressing. Each of your VPCs can include subnets (with access control lists), route tables, and gateways to your existing network and to the Internet.
You can connect your VPC to the Internet via an Internet Gateway and enjoy all the flexibility of Amazon EC2 with the added benefits of Amazon VPC. You can also setup an IPsec VPN connection to your VPC, extending your corporate data center into the AWS Cloud. Today we are adding two options to give you additional VPN connection flexibility:
  1. You can now create Hardware VPN connections to your VPC using static routing. This means that you can establish connectivity using VPN devices that do not support BGP such as Cisco ASA and Microsoft Windows Server 2008 R2. You can also use Linux to establish a Hardware VPN connection to your VPC. In fact, any IPSec VPN implementation should work.
  2. You can now configure automatic propagation of routes from your VPN and Direct Connect links (gateways) to your VPC's routing tables. This will make your life easier as you won’t need to create static route entries in your VPC route table for your VPN connections. For instance, if you’re using dynamically routed (BGP) VPN connections, your BGP route advertisements from your home network can be automatically propagated into your VPC routing table.
If your VPN hardware is capable of supporting BGP, this is still the preferred way to go as BGP performs a robust liveness check on the IPSec tunnel. Each VPN connection uses two tunnels for redundancy; BGP simplifies the failover procedure that is invoked when one VPN tunnel goes down.

Sunday, 16 September 2012

Cancel an AWS Account

AWS allows users to cancel their AWS account.

If you wish to cancel your AWS account, follow the steps below:

  • Log in to your AWS account as a returning user by selecting the option "I am a returning user and password is:", then click the "Sign in using our secure server" button.

Create an AWS Account - Free Usage Tier

Amazon Web Services helps its new customers get started in the cloud with a free usage tier. This tier is available to customers for 12 months.

Below are the highlights of AWS’s free usage tiers. All are available for one year (except SWF, DynamoDB, SimpleDB, SQS, and SNS which are free indefinitely):

The image below was updated as of October 1st, 2012 for the AWS RDS announcement. For the latest updates, please check AWS Free Tier for more details.



How to get started:

Saturday, 15 September 2012

Infographic: Why are more and more businesses moving to the cloud?

The cloud is one of the quickest growth areas in the IT sector. More and more businesses are using the cloud for their day-to-day processes, and according to analysts TechMarketView, the UK market for cloud computing reached £1.2bn in 2011, 38 percent higher than the previous year. That's some serious growth!

But why are businesses moving to the cloud? The reality is the cloud is fast becoming hard to ignore and in the infographic below we take a look at why:


Amazon RDS News - Oracle Data Pump

The Amazon RDS team is rolling out new features at a very rapid clip.
The most awaited feature - Oracle Data Pump - is finally here.

Extract from blog post by Jeff:
Customers have asked us to make it easier to import their existing databases into Amazon RDS. We are making it easy for you to move data on and off of the DB Instances by using Oracle Data Pump. A number of scenarios are supported including:
  • Transfer between an on-premises Oracle database and an RDS DB Instance.
  • Transfer between an Oracle database running on an EC2 instance and an RDS DB Instance.
  • Transfer between two RDS DB Instances.
These transfers can be run in either direction. We currently support the network mode of Data Pump, where the job source is an Oracle database. Transfers using Data Pump should be considerably faster than those using the original Import and Export utilities. Oracle Data Pump is available on all new DB Instances running Oracle Database. To use Data Pump with your existing v3 and v4 instances, please upgrade to v5 by following the directions in the RDS User Guide. To learn more about importing and exporting data from your Oracle databases, check out our new import/export guide.

For those who are not aware of what Oracle Data Pump is:

Oracle Data Pump is a feature of Oracle Database 11g Release 2 that enables very fast bulk data and metadata movement between Oracle databases. Oracle Data Pump provides new high-speed, parallel Export and Import utilities (expdp and impdp) as well as a Web-based Oracle Enterprise Manager interface.

  • The Data Pump Export and Import utilities are typically much faster than the original Export and Import utilities. A single thread of Data Pump Export is about twice as fast as original Export, while Data Pump Import is 15-45 times faster than original Import.
  • Data Pump jobs can be restarted without loss of data, whether the stoppage was voluntary or involuntary.
  • Data Pump jobs support fine-grained object selection. Virtually any type of object can be included or excluded in a Data Pump job.
  • Data Pump supports the ability to load one instance directly from another (network import) and unload a remote instance (network export).
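To make the network mode concrete, a network import with Oracle's impdp utility looks roughly like the command below. The connect string, database link name, and schema are placeholders of my own, so the command is composed and printed rather than executed; treat it as a sketch under those assumptions:

```shell
# Compose (but do not run) a Data Pump network-mode import into an RDS
# instance. NETWORK_LINK names a database link created in the target
# database that points back at the source; every identifier here is a
# placeholder for illustration only.
IMPDP_CMD="impdp admin@my_rds_tns_alias \
  NETWORK_LINK=source_db_link \
  SCHEMAS=app_schema"
echo "$IMPDP_CMD"
```

Because network mode streams rows over the database link, no intermediate dump file ever needs to land on the RDS instance's storage.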

Friday, 14 September 2012

Amazon EC2 Reserved Instance Marketplace

A superbly detailed blog post by Jeff on the Amazon EC2 Reserved Instance Marketplace.

No more words need to be added....

EC2 Options
I often tell people that cloud computing is equal parts technology and business model. Amazon EC2 is a good example of this; you have three options to choose from:
  • You can use On-Demand Instances, where you pay for compute capacity by the hour, with no upfront fees or long-term commitments. On-Demand instances are recommended for situations where you don't know how much (if any) compute capacity you will need at a given time.
  • If you know that you will need a certain amount of capacity, you can buy an EC2 Reserved Instance. You make a low, one-time upfront payment, reserve it for a one or three year term, and pay a significantly lower hourly rate. You can choose between Light Utilization, Medium Utilization, and Heavy Utilization Reserved Instances to further align your costs with your usage.
  • You can also bid for unused EC2 capacity on the Spot Market with a maximum hourly price you are willing to pay for a particular instance type in the Region and Availability Zone of your choice. When the current Spot Price for the desired instance type is at or below the price you set, your application will run.
Reserved Instance Marketplace
Today we are increasing the flexibility of the EC2 Reserved Instance model even more with the introduction of the Reserved Instance Marketplace. If you have excess capacity, you can list it on the marketplace and sell it to someone who needs additional capacity. If you need additional capacity, you can compare the upfront prices and durations of Reserved Instances on the marketplace to the upfront prices of one and three year Reserved Instances available directly from AWS. The Reserved Instances in the Marketplace are functionally identical to other Reserved Instances and have the then-current hourly rates; they will just have less than a full term and a different upfront price.