Saturday, December 29, 2012

Cisco Stares into the Developing Cloud

Networking company Cisco Systems released its second annual Cisco Global Cloud Index in October 2012. The Index examines the current state of cloud computing and extrapolates future developments.
If Cisco's predictions are correct, we're teetering on the brink of a major networking shift from fixed IT networks to cloud data centers. Cisco suggests 2014 will see the turning point, at which time over 50 percent of all workloads will shift to the cloud. The Index defines a workload as the processing power required by a server to run an app and support users interacting with the app.

Global Cloud Cover
The growing importance of cloud data centers is readily apparent; whether you’re an individual or a company, at least some of your data and apps now live in the cloud. Cisco's Index, however, indicates just how thick the cloud cover will be by 2016.
According to the Index, the Asian-Pacific region will generate more cloud traffic in 2016 than either North America or Western Europe, despite Western Europe's edge in network strength and broadband accessibility. By 2016, Asia-Pacific is expected to generate 36 percent of the global cloud workload.
Economies such as Hong Kong, Japan, South Korea, Singapore and Taiwan are already positioned to profit from the increased cloud workload, with China, Thailand, Cambodia and Vietnam increasingly likely to add to the cloud.
While the Asia-Pacific region will dominate the cloud, it won’t be the fastest-growing adopter of cloud technology. That title goes to the Middle East and Africa, which Cisco expects to show the most rapid cloud growth through 2016.

Cloud Thickness
Presuming Cisco's predictions pan out, by 2016 the cloud will process an overwhelming amount of data, estimated at 6.6 zettabytes a year. A zettabyte equals one sextillion bytes, or 1,000,000,000,000,000,000,000 bytes. Approximately 76 percent of this data will be internal data center traffic, according to Cisco estimates.

Mobile Lags Behind
While fixed networks across the globe are ready to handle basic and intermediate cloud apps, no region as a whole has yet emerged capable of handling advanced apps. Cisco defines basic apps as small games and text-based apps.
Intermediate apps include programs such as iTunes, Enterprise Resource Planning and Customer Relationship Management apps. Advanced apps include high-definition video streaming and video conferencing.
Some countries are already prepared for advanced cloud apps: Hong Kong, Japan, Romania, South Korea and Sweden are the only ones currently capable of handling advanced cloud apps over a fixed network.
As for mobile cloud apps, no country is ready for advanced cloud app use. Western Europe comes closest, being capable of handling intermediate cloud apps over mobile access.
Cisco's Index confirms what many analysts already know: cloud data centers represent an important, and potentially lucrative, field of development. Eventually, we're all going to live under the cloud.

My dear friend Michelle is an aspiring writer and blogger with a passion for the Internet, specifically social media and blogging. She loves how social media connects people across the globe, and appreciates that blogging gives her the opportunity to voice her thoughts and share advice with an unlimited audience.

Saturday, October 13, 2012

How is Cloud Computing Changing IT and Other Industries?

The advent and growing popularity of cloud computing represents perhaps the biggest change ever experienced in IT and technology circles. Bigger than tablet technology, bigger than anything Apple has ever done, perhaps an even bigger change than the first Windows operating system.
With change comes resistance; however, once the resistance has worn away, the real benefit of change emerges: the opportunity.
How is cloud computing revolutionising the industry from both a consumer and a business point of view?

PC Purchases
For the time being, manufacturers of personal computers are still packing their hardware with as much hard disk space as possible, and charging consumers ever-growing amounts of money for the privilege of an extra gigabyte or four.
How much of that hard disk space ever actually gets used? In most cases, not all of it; in fact, likely less than half. So why should consumers pay for something they don’t use?
Sooner or later, consumers will start saying no to high prices for gigabyte after gigabyte of disk space, realising they can save money by buying a cheaper laptop, using a free cloud service, and paying a small fee for any additional space they may need.

Job Opportunities
For the doom-mongers who believe humans are soon to be rendered useless, cloud computing is a disaster. It is yet another example of the world finding a way to automate tasks so they can be done more quickly and efficiently than if a human were doing them.
While this is true to some extent, the opportunities afforded by cloud computing are actually likely to make the reliance on IT grow, thus creating jobs. The affordable nature of cloud computing also means that start-up businesses can reduce their costs – funds that could go to bringing additional employees on board.
This is just scratching the surface, too, as cloud monitoring, security maintenance, and other tasks performed by humans will become much more in demand.

Business Added Value
Opinion is split over whether this factor is a good thing or not, but cloud computing will, eventually, force IT vendors to look at what else they can offer in terms of products to the consumer world.
It is not only IT vendors that will need to evolve, either. Retailers of entertainment products will begin to see interest in their stores dwindle as eventually computer games move exclusively online. DVD sales are suffering thanks to the likes of Netflix, much like CD sales dipped dramatically when iTunes became so popular.
While these changes will mean increased profits for the businesses that aren’t paying middleman fees, they leave those reliant on interest in particular products and services facing a challenging future.

Posterita, which helped me with this post, is retail software that allows chain stores and single stores to manage every aspect of their operations via an easy-to-use web-based platform.

Wednesday, September 26, 2012

What the Higgs Boson Particle Could Mean for Cloud Computing

Most businesses have come to realize the cost-saving potential of cloud computing. Large and reliable resources for data storage have become essential, whether a company specializes in sales, marketing or particle acceleration. Making use of the cloud typically involves allotting resources over a network, while grid computing is a similar concept that draws on computing resources spread across multiple networks.

Both of these systems have been widely analyzed for managing data and computing resources. Companies can outsource their computing needs to a third party, just as an SEO company can be enlisted for online marketing, website creation and traffic monitoring. Small businesses and big corporations alike benefit from such services, even an organization like the European Organization for Nuclear Research (CERN), which recently used its Large Hadron Collider in the discovery of a once-theoretical particle.

Cloud Computing and Particle Physics

With a 17-mile-long tunnel and repeated experiments, CERN produces a lot of data. As much as 1 GB per second is recorded in the facility’s datacenter. This adds up to so much data that even Moore’s Law, the principle that computing capacity roughly doubles every 18 to 24 months, is not enough to keep up. Data are measured in petabytes, one of which is equal to a million gigabytes. Time during experiments is measured in nanoseconds; with an estimated 600 million particle collisions per second during experiments, up to a petabyte of information can be collected during each run.

Many businesses use a public cloud, but CERN has created a grid network consisting of 150 different computer systems spread around the world. Ethernet connections link most of them, allowing some 300,000 cores to drive computational operations. When the Higgs boson, the particle theorized to give others mass, was found, enormous quantities of data were created for collision analysis, particle path tracking and statistical analysis.

Progressive Efforts

European research organizations and CERN have begun experimenting with a cloud resource called Helix Nebula – The Science Cloud. The project was implemented in March 2012 and is a partnership between various research groups, IT businesses and cloud vendors. Simulations from the particle collider have already been run through this network as part of a two-year study.

Many scientists consider the particle discovery one of the most important in recent times, while the cloud has become perhaps the most significant aspect of computing for businesses. The Higgs boson particle itself may not yet have applications directly related to IT, but the research surrounding it seems to be driving a push to advance public computing resources on a wide scale. Data security is important for businesses no matter what their focus is.

Byline: My best friend Michelle is a Content Specialist and Blogger with a passion for the Internet, specifically social media and blogging. She loves how social media connects people across the globe, and appreciates that blogging gives her the opportunity to voice her thoughts and share advice with an unlimited audience.

Wednesday, September 19, 2012

How to renew Alfresco Solr SSL certificate

This information is only applicable to Alfresco (Enterprise) 4.x versions
There needs to be two-way communication between the Alfresco server and the SOLR server. So that no one else can abuse this communication channel, it must be secured by means of HTTPS encryption and mutual client-certificate authentication.
There are three important points involved in setting up this mutual trust relationship:
  • Creating a 'keystore directory' and configuring the Alfresco and Solr servers to use it.
  • Generating and installing your own 'secure certificates'.
  • Replacing default certificates and handling 'certificate expiry'.

If you installed Alfresco and SOLR via the Installation Wizard, there is no need to perform step 1, as the directory and associated configuration will already be present. You can proceed straight to step 2.
If you installed SOLR manually, then please carefully review steps 1 and 2 - as otherwise, without configuring your own keystore directory, you may be picking up expired, default keys.

1. Creating a keystore directory and configuring the Alfresco and Solr Servers to use it
The following instructions assume SOLR has already been extracted and configured.
We will use ALFRESCO_TOMCAT_HOME to refer to the Tomcat directory where Alfresco is installed and SOLR_TOMCAT_HOME to refer to the Tomcat directory where Solr is installed. These may be the same or different directories, depending on whether you have chosen to install Solr on a standalone server or on the same server as Alfresco.
  • Ensure that Alfresco has already been started at least once, i.e. the ALFRESCO_TOMCAT_HOME/webapps/alfresco/WEB-INF directory exists.
  • Create and populate a keystore directory for the Alfresco and SOLR servers. By convention, we will create this in /alf_data/keystore. Please note that at this stage the keystore directory will just be a template, containing standard keys available to everybody. To secure the installation you must carry out the steps to generate new keys, specified in section 2.
    • Linux/Unix:

          mkdir -p /alf_data/keystore
          cp ALFRESCO_TOMCAT_HOME/webapps/alfresco/WEB-INF/classes/alfresco/keystore/* /alf_data/keystore

    • Windows
                    mkdir \alf_data\keystore
                    copy ALFRESCO_TOMCAT_HOME\webapps\alfresco\WEB-INF\classes\alfresco\keystore\* \alf_data\keystore

  • Configure the Alfresco and SOLR Tomcats to use the keystore and truststore for https requests, by editing the specification of the connector on port 8443 in ALFRESCO_TOMCAT_HOME/conf/server.xml and SOLR_TOMCAT_HOME/conf/server.xml as follows, remembering to replace /alf_data/keystore with the full path to your keystore directory

          <Connector port="8443" SSLEnabled="true" maxThreads="150" scheme="https"
              keystoreFile="/alf_data/keystore/ssl.keystore" keystorePass="kT9X6oe68t" keystoreType="JCEKS"
              secure="true" connectionTimeout="240000"
              truststoreFile="/alf_data/keystore/ssl.truststore" truststorePass="kT9X6oe68t" truststoreType="JCEKS"
              clientAuth="false" sslProtocol="TLS" />
  • Configure Alfresco itself to use the keystore and truststore for client requests to SOLR, by specifying dir.keystore in ALFRESCO_TOMCAT_HOME/shared/classes/alfresco-global.properties, remembering to replace /alf_data/keystore with the full path to your keystore directory
          dir.keystore=/alf_data/keystore
  • Configure an identity for the Alfresco server. In ALFRESCO_TOMCAT_HOME/conf/tomcat-users.xml, add a user entry for the repository (see the sketch after this list). Note that you can choose a different username, such as the host name of the Alfresco server, but it must match the REPO_CERT_DNAME you will later specify in the keystore in section 2.
  • Configure an identity for the Solr server. In SOLR_TOMCAT_HOME/conf/tomcat-users.xml, add a user entry for the Solr client (see the sketch after this list). Note that you can choose a different username, but it must match the SOLR_CLIENT_CERT_DNAME you will later specify in the keystore in section 2.
  • To complete the installation, it’s necessary to secure communications by generating your own keys. See section 2.
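For reference, the user entries added in the two steps above take roughly the following shape. This is a minimal sketch assuming the default certificate DNAMEs shipped with Alfresco; adjust each username to match the REPO_CERT_DNAME and SOLR_CLIENT_CERT_DNAME values you set in section 2:

          <!-- ALFRESCO_TOMCAT_HOME/conf/tomcat-users.xml (DNAME below is the assumed Alfresco default) -->
          <role rolename="repository"/>
          <user username="CN=Alfresco Repository, OU=Unknown, O=Alfresco Software Ltd., L=Maidenhead, ST=UK, C=GB" roles="repository" password="null"/>

          <!-- SOLR_TOMCAT_HOME/conf/tomcat-users.xml (DNAME below is the assumed Alfresco default) -->
          <role rolename="repoclient"/>
          <user username="CN=Alfresco Repository Client, OU=Unknown, O=Alfresco Software Ltd., L=Maidenhead, ST=UK, C=GB" roles="repoclient" password="null"/>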
2. Generating and installing your own secure certificates
Use these instructions to replace or update the keys used to secure communications between Alfresco and SOLR, using secure keys specific to your Alfresco installation.
NOTE: If applying these instructions to a clustered installation, the steps should be carried out on a single host, and the generated .keystore and .truststore files must then be replicated on all other hosts in the cluster.
The following instructions assume that solr has been extracted and a keystore directory has already been created, either automatically by the Alfresco installer, or manually by following the instructions in section 1.
  • Obtain the file generate_keystores.sh (for Linux and Solaris) or generate_keystores.bat (for Windows) from the Customer Support website under 'Online Resources > Downloads > Alfresco Enterprise 4.x > '
  • Edit the environment variables at the beginning of the file to match your environment (an illustrative sketch follows this list)
If you are updating an environment created by the Alfresco installer, you will only need to edit ALFRESCO_HOME to specify the correct installation directory
For manual installations, carefully review ALFRESCO_KEYSTORE_HOME, SOLR_HOME, JAVA_HOME, REPO_CERT_DNAME and SOLR_CLIENT_CERT_DNAME and edit as appropriate.
  • Run the edited script
  • You should see the message 'Certificate update complete' and another message reminding you what dir.keystore should be set to in alfresco-global.properties
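For orientation, the variable block at the beginning of generate_keystores.sh looks roughly like the following; every value shown here is illustrative only and must be adjusted to your installation:

          # Illustrative values only - adjust all paths and DNAMEs to your environment
          ALFRESCO_HOME=/opt/alfresco
          ALFRESCO_KEYSTORE_HOME=$ALFRESCO_HOME/alf_data/keystore
          SOLR_HOME=$ALFRESCO_HOME/alf_data/solr
          JAVA_HOME=/usr/java/latest
          REPO_CERT_DNAME="CN=Alfresco Repository, OU=Unknown, O=Alfresco Software Ltd., L=Maidenhead, ST=UK, C=GB"
          SOLR_CLIENT_CERT_DNAME="CN=Alfresco Repository Client, OU=Unknown, O=Alfresco Software Ltd., L=Maidenhead, ST=UK, C=GB"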
3. Replacing default certificates and handling certificate expiry
If you see errors such as the following in the logs, it means that the expiry date set in one or more of your SSL certificates has passed.
21:52:14,109 ERROR [org.quartz.core.ErrorLogger] Job (DEFAULT.search.archiveCoreBackupJobDetail) threw an exception. 
org.quartz.SchedulerException: Job threw an unhandled exception. [See nested exception: org.alfresco.error.AlfrescoRuntimeException: 07180158 Backup for core archive failed .... ] 
at org.quartz.core.JobRunShell.run(JobRunShell.java:227) 
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:563) 
Caused by: org.alfresco.error.AlfrescoRuntimeException: 07180158 Backup for core archive failed .... 
at org.alfresco.repo.search.impl.solr.SolrBackupClient.executeImpl(SolrBackupClient.java:158) 
at org.alfresco.repo.search.impl.solr.SolrBackupClient.execute(SolrBackupClient.java:112) 
at org.alfresco.repo.search.impl.solr.SolrBackupJob.execute(SolrBackupJob.java:58) 
at org.quartz.core.JobRunShell.run(JobRunShell.java:216) 
... 1 more 
Caused by: org.apache.solr.client.solrj.SolrServerException: javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path validation failed: java.security.cert.CertPathValidatorException: timestamp check failed

It is recommended to generate new secure certificates by following the instructions in section 2.
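To confirm that certificate expiry really is the cause, you can inspect a keystore's validity dates with keytool; a minimal sketch, assuming the default keystore location and password shown in section 1:

          # Prints each certificate's "Valid from ... until ..." line
          keytool -list -v -keystore /alf_data/keystore/ssl.keystore -storetype JCEKS -storepass kT9X6oe68t | grep "Valid from"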
As a temporary measure, you can substitute all your existing .keystore, .truststore and .p12 files with the new Alfresco default files. These can be found in the zip file 'keystores.zip', available in the support website download section alongside the generate keystore scripts.

There are numerous locations for these files in the Alfresco/SOLR install; you must find and replace all the .keystore, .truststore and .p12 files with the new secure certificates. A quick way to locate them is sketched below.
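As a sketch, assuming /opt/alfresco as the installation root (substitute your own), the following locates every affected file:

          find /opt/alfresco \( -name "*.keystore" -o -name "*.truststore" -o -name "*.p12" \)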

An example list of typical (v4.0.2.x) file paths to be updated is below, but please be aware that these files may be located in different relative locations on your system:
/alf_data/keystore/browser.p12
/alf_data/keystore/ssl.truststore
/alf_data/keystore/ssl.keystore
/alf_data/solr/workspace-SpacesStore/conf/ssl.repo.client.truststore
/alf_data/solr/workspace-SpacesStore/conf/ssl.repo.client.keystore
/alf_data/solr/archive-SpacesStore/conf/ssl.repo.client.truststore
/alf_data/solr/archive-SpacesStore/conf/ssl.repo.client.keystore
/alf_data/solr/templates/test/conf/ssl.repo.client.truststore
/alf_data/solr/templates/test/conf/ssl.repo.client.keystore
Use the generate keystore script provided with the Alfresco Enterprise version you are updating.
In the case of version 4.0.2, it is best to update your install to 4.0.2.9; otherwise, use the scripts found under the 4.0.2.9 downloads for 4.0.2 secure certificate generation.
The /alf_data/solr/templates directory does not exist in 4.0 and 4.0.1 installs.
Users connecting directly to the SOLR web app will need to replace their browser.p12 file with the new one (http://docs.alfresco.com/4.0/topic/com.alfresco.enterprise.doc/tasks/solr-SSL-connecting.html)
(cluster) If applying these instructions to a clustered installation, the .keystore and .truststore files must be replicated on all other hosts in the cluster.




Friday, August 17, 2012

Does Cloud Computing Hold Security Risks?

No matter what business you walk into, there will be some measure of security when it comes to their systems. Whether it is individual user identification for logging into systems and having certain privileges, or certain passwords to access particular programs, there will have been some measure taken to restrict access.

The Cloud
In recent years, cloud computing has emerged as a popular innovation and a great platform for businesses. The biggest benefit from a business point of view has been the reduction of both short and long-term costs associated with data storage. Cloud computing has allowed businesses to begin storing data online, at a much cheaper cost than they were previously paying for hard disk space and servers, which could then potentially slow down computer systems and make the workforce less productive.
Of course, for everything that brings a benefit, there are dangers that are associated with it, especially when platforms are relatively new, as is the case with cloud computing.

Carelessness
The biggest risk with cloud computing is unquestionably security. Unless cloud storage is set up to only be accessible from specific ports or through an intranet, for example, any business employee could access their work information at home or even during a night out with friends.
I am sure that the thought of an employee sharing business information with their friends and anyone else within earshot over a few drinks would send a shiver down the spine of most managers. Yes, people discuss work in their social lives, but do they always pull out their phone to prove a point, unwittingly revealing confidential business information? This is a distinct possibility, and one which has seen human resources employees furiously rewriting “Communications in the Public Domain” guidelines for employees to follow.

Tracing Difficulties
One of the common problems businesses face when everyone is privy to a password, as is the case in many instances, is that if something goes wrong there is no real way to find out who was responsible. Cloud computing solves this problem only if businesses assign each individual employee an access name and password. Unless this is done, security will be further compromised.

Public Networks
Perhaps the biggest problem with cloud computing and storage is that, because it is designed to be used on the go, professionals everywhere are potentially connecting to public Wi-Fi networks without knowing who else is on the network and what they are potentially doing. This could easily lead to a business's information being placed into the public domain, as well as increase the possibility of a business or individual becoming a victim of fraud.

Posterita is an inventory management system that allows chain stores and single stores to manage every aspect of their operations via an easy-to-use web-based platform.

Wednesday, August 1, 2012

5 benefits of Cloud Computing to your Business

Cloud computing services have improved rapidly over the past several years to the point where they offer many clear advantages to businesses. These are just 5 of them:

1) It’s time-saving. Rather than having to manually install computer updates, everything is taken care of in the cloud. And because pretty much any device can be connected to the cloud, if one computer doesn’t work you can move to a different device without having to wait for it to be repaired.

2) You can respond to growth easily. IT constraints have often stopped businesses from growing as rapidly as they otherwise could have. Cloud computing allows businesses to easily upscale their requirements without the huge investments needed by traditional IT solutions. Conversely, if you suddenly find your IT requirements shrinking, you are not left with expensive and under-used equipment – you can simply downgrade your cloud computing package. 

3) You can cut costs. There are many ways that cloud computing can save businesses money. For example, your staff costs will be reduced because you don’t need the same level of in-house technical support that you would if you housed your own IT system. You’ll also save money on the hardware required by non-cloud computing approaches.   

4) You get access to the latest technology. You don’t need to worry about updating software, as your service provider will make sure everything is up to date for you. Software is often very expensive to buy outright; by ‘renting’ it you can have access to tools that would otherwise be a risky investment.

5) You have greater security. Many people are worried about the potential security risks of cloud computing, but the fact is your data is probably more at risk if you are in control of it yourself. Many service providers also offer guarantees, in case anything does go wrong.


This post was written by my cool friend Gary Newton - a virtual desktop expert from the UK.

Tuesday, July 24, 2012

Top 5 Cloud Apps for Better Web Development

Web development is a difficult line of work that requires commitment and strategy to get through the daily operations and emerge successful in the profession. Unlike many other careers, it requires individuals to be creative and have a unique perspective. Because of the need for creativity in developing web content, developers may encounter various challenges, including:

1. Having a controlled environment in which to work so that they can exercise their creativity. This is important because it is not easy for people to pin down the precise place where their productivity is at its peak. Working from places where they are uncomfortable may slow productivity, while other places may prove very useful in generating viable ideas.

2. Web development also becomes challenging where specific content requires precise arrangement and organization. For example, it can at times be very difficult to orient data to the client's specifications due to user interface problems, and the work is then deemed substandard.

3. Problems also arise when there is miscommunication and file sharing with the relevant teams breaks down. This bothers most developers, especially when they are on the move (away from a computer), since it delays sharing of the work. There is also the fact that humans are prone to forget things, especially ideas that pop up at random times, sometimes even in the middle of the night. Instructions the client has highlighted may likewise be lost and flaw the outcome if not noted down appropriately. And in cases where developers are travelling or tied up somewhere and need to deliver photo edits urgently, lacking access to a machine may spell disaster: they will rush things and deliver content that is undesirable to the user, losing a prospective client.

On a positive note, these are challenges that can easily be overcome by using specific applications, including:

1. Pixlr
This is a great app for all developer photo needs as it provides all the tools for editing photos directly on a developer's handheld device. It's a free app.

2. Dropbox
When it comes to storing and sharing important files, there is no better alternative to this app, as it offers a generous provision of space and user options. This is a free app with subscriptions for extra storage.

3. Codeanywhere
This is a particularly handy app that gives users the ability to work on code within their phone's browser, saving time. The small price is worth paying for the efficiency it offers.

4. Gridfox
This is an extension for the Firefox browser that overlays grids on web pages, giving better renditions and the convenience of checking alignment according to the developer's preference. There are no charges for this add-on.

5. Evernote
Evernote serves as a digital notebook for all of a developer's idea-collection needs. It allows points to be jotted down quickly in the form of text, audio and images, making it very reliable as a handy web developer tool. It is a free app with subscriptions for additional features.


Author Bio:
My friend Biljana is a tech blogger and writer publishing useful information for web design and development professionals, as well as the latest news on virtual machine app implementation for improved project performance.

Thursday, July 19, 2012

Project Management in Cloud Using Trello: Benefits to Managers in the UK

The cloud remains dominant in terms of software and document distribution, and there are an increasing number of online solutions that enable managers to run their projects in a timely and efficient manner. Trello is a prominent example of this technology, using cloud principles to connect business owners and contractors regardless of where they are situated in the world. With this in mind, what are the exact benefits of Trello, and how can it allow businesses to operate more proficiently while reducing their costs?
  • Real Time Interaction and Document Sharing: In days gone by, sharing documents was an all too complicated process, and reliance on flawed technology such as fax machines and post often caused hold-ups that could significantly delay a project’s completion. With Trello, documents and information can be shared in real time, and collaborators can interact live and maximize the efficiency of any individual project.
  • An Innovative and Flexible Online Solution: The purpose of technology is to provide flexible and innovative solutions to problems, and this has been relevant throughout numerous niche sectors and industries. Just as firms such as Ukhost4U have delivered evolving web hosting solutions to clients, so too has Trello emerged as a malleable tool that can be used to house multiple projects simultaneously.
  • Cost Effective Project Management: Given the tough and unforgiving economic climate, it is little wonder that businesses nationwide are striving hard to save money and minimize their monthly costs. Trello allows them to do this effectively, as project managers and coordinators require far less equipment and staff to share documents, distribute tasks and ultimately complete a project in its entirety.
Conclusion
Trello is a tremendously purposeful and resourceful online tool, and is continuing to revolutionize project management in the UK. It is also simple to use, and free to access for small, medium and large businesses alike.


Saturday, July 14, 2012

How Cloud Computing Has Changed Media Devices

Teachers used to tell their students to get their heads out of the clouds during the busy school day, and employers always dread those employees who seem to have their heads in the clouds as well. The irony is that in today's increasingly digital world, the place you want your students and employees to be is "in the cloud". Of course, this refers to the digital cloud of computing: the "virtual" storage world where documents, media, movies and photos are all held on servers in a cloud-based computing facility, freeing up storage on media devices and allowing near-instant access to documents and files from anywhere in the world.

These days it seems everyone is getting in on the "cloud" action. Apple with iTunes and iCloud. Microsoft with Azure and its music service (formerly Zune). Amazon with its cloud storage services for music and pictures. Heck, even Newegg has a cloud service aimed at the consumer level, enticing you with a Newegg coupon code for discounted rates to bring you in and get you hooked. It seems like you can't get away from them. Clouds really are the future of media storage and consumption. 

What does all this cloud activity add up to? It means the game has truly changed for media devices such as smartphones, iPads and tablets, and laptop computers. One of the biggest changes is the ability to keep and store files in perpetuity without having them bog down or clog up valuable hard-drive space on the media devices. By uploading old data and files into the cloud, the user gets access to them without having to store them on a personal hard drive. This allows manufacturers to bring down the size and weight of a device, which in turn brings down the cost of our favorite gadgets. 

Cloud-based computing has also changed the game for some media devices by usurping what used to be the job of your friendly neighborhood geek pal. This job is now the cloud-based computing service's responsibility. There is no need to set up complicated and demanding mass storage or networking permissions systems. With simple uploading over Wi-Fi or Internet streaming, all data and files are shared and accessed on a variety of home media devices. No more panic attacks when your laptop hard drive crashes, or dreading data/music transfers to a new phone or tablet. Not only is everything within easy reach via cloud storage, it is at a fraction of the cost of your old setup.

For families and friends, cloud services allow large video and photo transfers to take place without long up- and download times. This allows a mother to take a video or photo of her children with her camera phone, upload it to Facebook or Google+, and share it with Grandma and Grandpa for almost instant viewing. No more waiting for holiday gatherings to share home videos or photos. This brings families and friends together, even if they live across the country or globe. 

The biggest change is for those who love watching movies or TV on their media devices. Cloud systems allow instant streaming from enormous databases of cloud-stored shows and movies. This means access to huge media libraries, without even being home. Watching from home is getting easier too: with new "smart TVs" there is no need for complicated networking to get your PC hooked up to your TV. They come with built-in WiFi, so just plug in your network password and head out to your favorite cloud service like Amazon or Netflix. MP3s changed how we get and listen to our music, and I predict the same will happen with the DVD/Blu-ray market. Sony may have won the HD format war with its Blu-ray product, but the format is already on its way to being extinct. Why would you buy a DVD or Blu-ray when you can order a digital copy online, store it in a cloud service, watch it anywhere, and never have to worry about your kids damaging the disk?

These are just a few ways that cloud storage has changed how we expect our latest media devices to work. How have cloud services changed how you use your media devices?

My friend Eric Cedric is a technology lover and die hard Mac user. He recently switched over to cloud based storage and is using it as he continues his worldwide travels and adventures.

Thursday, June 28, 2012

How Cloud Computing Has Changed TV

Cloud computing sounds like it should be something overwhelming and hard to understand, but nothing could be further from the truth. The "clouds" are nothing more than large networks of servers working together to provide storage and process data. It's just like your computing network at the office, only larger.

Cloud computing has been a long time in the making, only recently gaining popularity among average internet users. The idea of storing our data in cyberspace can feel daunting, but when you consider the size of today's multimedia files—music, videos, commercials, webisodes, movies, etc.—cloud computing makes a lot of sense. A high-definition movie can run three to five gigabytes, and it doesn't take too many of them stored on your local hard drive to chew up all your storage.

The real beauty of cloud computing is its portability. No matter where you are, if you can get on the internet you can access all of your files. Not carrying around your DVD collection everywhere you go is no small blessing.

Getting Your TV Fix with Streaming Services

This, of course, is how large video streaming companies operate. It's the only real way they can provide services to so many simultaneous connections. Streaming video giants like Amazon, Netflix, and Hulu have helped shape the way we view and use cloud storage, but that hasn't always been the case. It's taken more than a decade for the concept to take root among internet users, even though the technology has been there for much longer.

Netflix, one of the largest streaming video companies in the world, was originally a DVD-only company. That was back in 1999, when it offered subscribers a modest video library of 100,000 titles. It took about seven years for the company to begin offering streaming video from its cloud network of storage and application servers. Early subscribers received approximately one hour of streaming for each dollar of their monthly subscription.

Netflix, the biggest source of internet traffic in North America, moved to unlimited streaming in 2008, and usage shot through the roof. According to Nielsen figures from June 2011, 42% of Netflix's 26 million subscribers use a stand-alone computer to access its content.

Amazon got into the streaming content action in 2006 with Amazon Instant Video. This service gave users the ability to delete large video files from their personal computers after viewing, storing them instead on Amazon's cloud, where they could be accessed at any time.

Hulu was just coming into existence as Amazon was rolling out its Video On Demand service and Netflix was contemplating unlimited access to streaming content. Hulu's approach was unique and has been a huge success as a result.

Hulu negotiated with content owners for the right to re-broadcast their content. They offer movies and television shows from the biggest names in the business, with ad-supported content as well as premium content through their Hulu Plus offering.

Portability is the name of the game in cloud computing, and when it comes to very large files, like Blu-ray and HDTV shows, there's nothing better than accessing your entire library of movies, or your favorite television shows, no matter where you are.

Traditional TV Service Providers Get into the Game

Cloud computing has created a massive marketplace that is just beginning to scratch the surface of its potential offerings. Cable and satellite television providers have become big and important players. While still offering Pay per View content, it is the Video on Demand offerings that have exploded in the past five years, with many service providers offering more than 30,000 unique titles each month.

Premium content providers like HBO, Starz, and Showtime offer free access to subscribers, through their satellite and cable television partners, to their vast selection of uncut movies, original series, and special events like sports and concerts.

Dish Network Steps it Up

DISH Network has taken cloud computing to the next level. Through their new whole-home DVR, the Hopper, subscribers can record certain television shows and have them saved in the DISH Network cloud.

The service is called Primetime Anytime, and it allows subscribers to record several primetime television shows each night on ABC, CBS, Fox, and NBC. These programs, which can be recorded in HD, are stored like On Demand programs and don't take up storage space on the Hopper DVR.

That's just the beginning of how DISH Network has adapted to, and in some instances made unique advances in, cloud computing. Another is the "AutoHop" service that subscribers can activate on the Hopper. AutoHop automatically skips the commercials that aired during primetime television programming when subscribers play back Primetime Anytime recordings. However, if you think the content providers with whom DISH Network has licensing deals have taken this lightly, think again.

CBS, Fox, and NBC have individually sued DISH Network for essentially the same things. One of the issues in the lawsuits is the violation of copyrights by changing content they do not own. Another issue is the claim they are destroying the very way broadcast television works. Each network is also citing breach of contract.

DISH Network has countersued in an attempt to have the legality of its service decided by a federal judge. It is possible these cases could eventually wind up being heard by the Supreme Court.

No matter what happens between DISH Network and its AutoHop technology, cloud computing isn't going away. In fact, it's becoming an essential tool for internet users and, as a result, more industries are finding their way into what will surely become the de facto standard in high-capacity storage and computing.

This guest post is written by my dear friend Edwin, a writer and content specialist for USDish.
I am very thankful to him for this great post.

Saturday, June 23, 2012

How cloud computing has empowered outsourcing

From the dawn of traditional outsourcing to the age of cloud computing, the application of technology has enabled firms to save huge amounts of capital since the turn of the century. This trend is unlikely to change: according to a survey conducted by Vanson Bourne, a significant 93% of businesses believe that cloud computing will continue to drive their venture forward in 2012.

With this in mind, is there still a place for traditional outsourcing organizations in the contemporary business world? Given the continued advancement of cloud computing as a business tool, and the evolving nature of its applicable software, it is difficult to imagine many companies returning to old-school IT outsourcing providers at any point in the near or distant future.

The Cloud vs. Traditional Outsourcing: Why is it Such a One-Sided Battle?

The reasons for the dominance of cloud computing over traditional outsourcing methods are similar to those which allowed the latter to become prominent in the first instance, and centre around the growing need of businesses to drive their operational costs down. Just as traditional outsourcing afforded businesses the chance to farm out a growing number of tasks and employ a more flexible, cost-effective and project-based workforce, cloud technology has evolved this principle while delivering even more pronounced financial savings.

More specifically, cloud computing and open source software have exposed the true, and often extortionate, profit margins made by large outsourcing firms. The costs of allocating skilled work to large overseas corporations or independent contractors are far greater than those associated with employing shared services to process work and non-strategic tasks, which has in turn provided an ever more efficient and affordable way for firms to operate. With cloud computing software in place, companies can share their resources with collaborators and minimize their outgoings in the process.

Streamlining the Process of Outsourcing: Eliminating the Service Cost

With the traditional outsourcing model, organizations would pay for far more than the simple service provided. On top of this was the use of time and resources to fulfill a contractual obligation, which contributed towards a disproportionate and ultimately oversized cost. Cloud computing eliminates this by streamlining the process of outsourcing, with shared resources and flexible infrastructure key to saving both time and money. The process gives businesses the tools to become more profitable, which can make a significant difference in the current economic climate.

As anyone who has ever hired a member of staff through an employment agency will testify, the difference between the bare cost of labour and the total fee is sizeable, and often too much for a small or independent business to bear in a depressed economy. This is why the stock of e-recruitment continues to rise, as the ubiquitous nature of online connectivity, social media and cloud computing principles continues to reshape business practice across many different industries. All of these money-saving processes have a foundation in cloud computing and in the sharing of platforms and informational resources to complete tasks.

Conclusion

So although some may say that traditional outsourcing is now outdated and moribund as a business concept, it is perhaps fairer to suggest that it has simply played its part in a continual, never-ending evolution. Cloud computing is simply the next stage of this evolution, one which delivers further financial savings for firms while impacting a significant range of business, IT and sales tasks. So just as firms such as UKHost4u can provide relevant web hosting solutions, so too can cloud-based CRM packages and software streamline a company's sales operation.

Tuesday, June 19, 2012

Testing In Cloud Computing



Fundamental Questions about Testing in Cloud

Cloud testing and TaaS (testing-as-a-service) are comparatively novel subjects in the software testing community, even though many technical papers have been published discussing cloud architectures, technologies, models, design, and management. As a result, test engineers and quality assurance managers often encounter many issues and challenges in testing modern clouds and cloud-based applications. Typical questions that come up are as follows:
  • What is cloud testing?
  • What are its special test processes, scope, requirements and features?
  • What different types, environments, and forms of cloud testing do we need to perform?
  • What are the differences between cloud-based software testing and traditional software testing?
  • What are the unique requirements and distinctive features of cloud-based software testing?
  • What are the special issues, challenges, and needs of testing in the cloud?
  • What is TaaS and how is it related to the cloud?
Cloud computing is a model enabling on-demand access to a shared pool of configurable computing assets that can be rapidly provisioned and released with minimal effort. It comes in three key dimensions of service offerings: software-as-a-service (SaaS), platform-as-a-service (PaaS), and infrastructure-as-a-service (IaaS). Gradually, a fourth dimension is being added in the form of testing-as-a-service (TaaS) in the cloud. TaaS is an effort to bring the benefits of cloud computing to the world of software testing. By leveraging the benefits offered by cloud computing, TaaS can help in the following ways:
  • Help to cut the cost of quality in the cloud
  • Decrease the time needed to create the test environment
  • Reduce full-time resource requirements for testing
  • Diminish test cycle time
  • Enhance parallel and intensive load testing
  • Hence, reduce the time to move to production. 
To understand the relative position of this evolving fourth dimension, TaaS, we need to understand the other three dimensions of cloud offerings: SaaS, PaaS and IaaS.
A typical cloud must have several distinct properties: elasticity and scalability, multi-tenancy, self-managed function capabilities, service billing and metering functions, connectivity interfaces and technologies. The highlights of cloud computing can be captured in the following tags:
  • Clouds can be Private, Hybrid, Public 
  • Cloud services can be as SaaS, PaaS, IaaS 
  • Cloud usage can be Enterprise, Community, Open
Cloud Migration: everybody is thinking about it, if they have not started already
Cloud computing provides a lucrative and customizable means through which scalable computing power and varied services (such as computer hardware and software assets, networks and computing infrastructures), different application services, and business processes, from personal intelligence to collaboration, are delivered as services to large-scale global users whenever and wherever they need them.
As a result, most businesses in the large and SMB (Small and Medium Business) segments have already started, or are planning to initiate, migration to the cloud. The principal benefits being looked at are cost effectiveness and the ease of scaling the infrastructure up and down.

Over to Cloud: What changes for the Testing players?
Traditionally, people have worked on single-server applications, and several techniques and approaches were developed and practiced to ensure the quality of such applications. Now that migration to the cloud is becoming general practice, it is important to understand the Do’s, the Don’ts, and the How-To of cloud migration. To execute cloud-based software testing, one needs to work out how to carry out testing and measurement activities in a cloud-based environment and how to leverage cloud technologies and solutions. It is important to figure out the relative balance of effort/tools/assets that need to be employed to make sure that each of the factors affecting the end-user experience is sufficiently taken care of without inflating the cost of quality. The objectives of the quality group responsible for testing in the cloud should now include the following points:
  • To guarantee the quality of cloud-based applications deployed in a cloud
  • To confirm the functional services, business processes, and system performance of the application deployed on the cloud
  • To verify the scalability based on system requirements of the cloud based application
  • To validate software as a service (SaaS) in a cloud environment
  • To validate the software performance and security
  • To validate the provided automatic cloud-based functional services, for example auto-provisioned functions
  • To test cloud compatibility and inter-operation capability between SaaS and applications in a cloud infrastructure, for example, checking the APIs of SaaS and their cloud connectivity to others
Types of Cloud Testing
Cloud testing can take the following four forms, depending on whether you are a cloud vendor, an application provider, or an end user:
  • Testing over cloud: It tests cloud-based service applications over clouds, including private, public, and hybrid clouds, based on system-level application service requirements and specifications. This usually is performed by cloud-based application system providers.
  • Testing of cloud: It validates the quality of a cloud from an external view, based on the cloud's specified capabilities and service features. Cloud and SaaS vendors, as well as end users, are interested in carrying out this type of testing.
  • Testing inside cloud: It checks the quality of a cloud from an internal view based on the internal infrastructure of a cloud and specified cloud capabilities. Only cloud vendors can execute this type of testing, as they have access to the internal infrastructure and the connections between internal SaaS offerings and automatic capabilities, security, management and monitoring.
  • Testing SaaS in cloud: It aims to assure the quality of a SaaS in a cloud in terms of its functional and non-functional requirements.
What is new in Cloud Testing?
The new features that are required in a cloud based testing environment are principally of the following four types:
  • Cloud-based testing environment/platform: The scalable environment/platform is a new function compared to the traditional fixed, dedicated and pre-configured testing environment.
  • SLAs of the services: In cloud computing, clouds, SaaS, and applications usually present diverse services to their end users and customers with well-defined service-level agreements. Obviously, these agreements will become part of the testing and quality assurance requirements, covering system reliability, availability, security, and performance.
  • Price models and service billing: Since utility computing is one of the basic concepts and features of cloud computing, price models and utility billing become basic parts of testing as a service. In other words, required computing assets and infrastructures and testing task services will be charged based on pre-defined price models.
  • Large-scale cloud-based data and traffic simulation: Applying and simulating large-scale online user accesses and traffic data (or messages) over connectivity interfaces is necessary in cloud testing, particularly in system-level function validation and performance testing.
TaaS Details
TaaS is the enabling of static/dynamic on-demand testing services in/over clouds for third parties at any time. One of the primary objectives is to reduce businesses' IT spending and let them focus on their core business by outsourcing software testing tasks to a third party using the TaaS service model. The TaaS workflow can be divided into several sub-tasks which need to be completed to make the TaaS model work. The sub-tasks in the TaaS workflow are as follows:
  • TaaS process management, which offers test project management and process control.
  • QoS requirements management, which supports the capture and modeling of software testing and QoS requirements, including quality assurance modeling.
  • Test environment service, which provides on-demand test environment services to set up the required virtual (or physical) cloud-based computing assets and infrastructures, as well as the necessary tools.
  • Test solution service, which offers diverse systematic testing solutions (like  test modeling and test methods), and test-ware generation and management services.
  • Test simulation service, which establishes on-demand test simulation environments with selected facilities and supports the necessary test message generation.
  • On-demand test service, which provides on-demand test execution services based on selected schedules and test wares.
  • Tracking and monitor service, which allows test engineers to track and monitor diverse program behaviors at different levels in/on/over clouds for the testing purpose.
  • TaaS pricing and billing, which enables TaaS vendors to offer customers selectable testing service contracts based on pre-defined pricing models, plus billing services.

Monday, June 18, 2012

The “Personal Cloud” Theory

As the IT industry becomes more complex and widespread, consultants find it useful to coin new labels for unfolding phenomena. The most recent example is the now-common use of the “personal cloud” buzzword. It describes what will be displacing what has heretofore been the key focus of computing: the deployment of desktop personal computers. In the new epoch of “post personal computer” computing, the Gartner firm has started promoting the theory of the “personal cloud” to describe what is now being adopted at a rapid pace as the new approach to computing.
There is a drastic difference between desktop computing using stand-alone personal computers and what can now be found in cloud-based networks that support the highly diverse computing needs of a user population. The differences are technological, economic, and behavioral. Migrating from reliance on desktop personal computers to personal cloud computing represents a major change. It cannot be achieved through small incremental improvements; it requires a rethink of the architecture and the organization of how systems are designed and then delivered.

The idea of cloud computing is rooted in the technology of virtualization. The evolution of personal computing was based on the journey of information processing from the central mainframe computer into the hands of individual users through the desktop computer. The personal cloud is reversing this trend. As the processing power of billions of personal desktop computers expands, the utilization of their capabilities is shrinking. Only a part of the computer's logic is now dedicated to applications, given the increased access to computer services directly from the Internet. The overwhelming majority of desktop computing is dedicated to processing code that deals with the operating system, to the manipulation of databases and to the organization of communications. As security vulnerabilities escalate, much of the computing power is dedicated to security assurance. The computer that is now cabled to sit on the desktop is increasingly inadequate as users shift to mobile computing.

Virtualization of computing makes it possible to create pools of server-based central computing power that can distribute computing cycles to thousands of individual desktop machines. Virtualization allows for the pooling of data storage to obtain better utilization of capacity. Virtualization combines expensive communication services to serve a cluster of virtual computers, and so reduces the exposure to security risks. Virtualization detaches local desktop hardware from having to maintain the large overhead of operating systems. Virtualization makes it possible to take advantage of the economies of scale of the combined capacity of hundreds of thousands of central servers while enabling failover and automatic backup. Virtualization can deliver especially high levels of service reliability. Through separation of applications from the underlying infrastructure, it is possible to create data processing utilities that deliver a standard computing environment independent of the user's device. In this mode, the user gains the liberty to run applications from any computing device in any location. The central means of access to computing is no longer the dedicated personal computer, but the personal cloud, which is obtainable anywhere, at any time and from any device.

The personal cloud makes it possible to obtain computing services from consumer-grade devices available at swiftly decreasing competitive prices. Such devices browser-connect to the cloud without requiring expensive systems integration labor. They depend on the network infrastructure and need only simplified browser software for application services. This arrangement greatly reduces security risks, since the user’s smartphone or tablet does not store operational code, the most important source of vulnerability. All such code is stored on servers, where it can be protected much more effectively.

The low cost of disposable and rapidly obsolescent user devices can be set against the large capital cost of cloud computing centers, where all of the processing and communication takes place. The central facilities can then be constructed to have low failure rates, because engineering focus can concentrate primarily on the delivery of commodity machine cycles. The personal cloud device can be set up to incur user charges based on computing usage, on a per-use basis.

The feature-rich desktop computer locks users into fixed-cost economics. The personal computer user must then be fitted into custom designs that are neither interoperable nor subject to transplantation into another environment. On the contrary, if the cloud is constructed using open source application interfaces, the cloud customer can take advantage of competitive offerings from a variety of commercial cloud services, each offering a variety of pricing plans and an assortment of services. Such an arrangement can also be characterized as a hybrid model.

The owners of personal clouds will have access to a range of public and private cloud services. In a typical environment, the private cloud will contain proprietary and possibly classified computing services. Depending on security and pricing terms, customers can then shop for applications from a huge collection of ready-to-use applications, usually available for a fixed price inclusive of maintenance. There will be no cost of integration, whereas the user of a personal computer will have to disburse money for custom adaptation of every application into a uniquely defined computing architecture. In the personal cloud it is possible to rapidly swap applications for upgrades, provided that the application interfaces are constructed as reasonable open source protocols. In contrast, owning a personal computer will always mean incurring costs to make changes for any functionality.

The dependency on personal computing has pushed developers to organize projects that build increasing complexity and size into any design. Constructing a project that assumes total interdependency between desktop, laptop, server, database and network computers imposes rising costs of synchronization and integration on the entire effort. Consequently, the size of individual projects keeps growing, implementation schedules lengthen, and the gap between the original requirements and what is delivered widens, until much of what was promised and ultimately delivered is rendered obsolete.

The personal cloud avoids the dependency on large projects. For example, the database storage pool can be constructed as a utility service with an implementation schedule measured in decades. The server computer processing pool can proceed on a schedule that requires only a few years, whereas the assembly of local applications can proceed at a pace of only a few days.

Summary

The last decade has seen a rapid pace of evolution in computing. We are seeing orders-of-magnitude changes. Firms are moving from response times measured in hours to results in minutes and even seconds. Enterprises need to instantly process billions of transactions that require not only internal data but also integration with external sources.

The client-server model, which depended on billions of desktop computers to assist in the processing of local work, has outlived its utility. The movement towards the “post personal computer” era has already begun. It places reliance on pooled cloud computing services, which are accessible through a person’s own private cloud. The private cloud can then be defined as access privileges to a collection of cloud services.

In its network form, the personal cloud represents a shift from awareness based on the ownership of assets to the ability to have widespread access to everything that is available. The personal cloud will endow each person with the potential to access knowledge that has hitherto remained unreachable.
