Posts Tagged ‘Cloud computing’
Electronic Health Record (EHR) Trends: from wearables and telemedicine to cloud computing and big data
Posted April 11, 2016
The new trend for many medical practices is adopting an EHR (Electronic Health Record) system. While many practitioners still use paper files and travel cards, an EHR brings better efficiency to billing, reimbursements, audits and more. Admittedly, there are more systems than doctors, but acquiring an EHR enables a more efficient practice and perhaps more revenue for it.
In this post we highlight the most important EHR trends we expect to unfold this year. Wearables, telemedicine and mobile medicine will continue to advance, joined by cloud computing, patient portals and big data.
Telemedicine and wearables plus EHR
The telemedicine market is forecast to exceed $30 billion in the next five years, as providers increasingly see the need to reach seniors and patients in rural areas. Telemedicine offers tons of value to seniors. It improves care by reaching remote patients who live far from hospitals, and it enables homebound patients to get high-quality care. It makes care cheaper and allows seniors to stay at home longer. It benefits providers by making their jobs more flexible. And it eliminates the risk of picking up new illnesses in a clinical care setting.
Wearables’ mass adoption has made store-and-forward telemedicine much easier. Devices like Fitbits automatically collect valuable health data. Store-and-forward telemedicine just means that data goes to a doctor or medical specialist so they can assess it when they have time.
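To make the idea concrete, here is a minimal sketch of the store-and-forward pattern: readings are captured locally with timestamps, then forwarded as one batch for a clinician to review later. The device, metric names and serialization here are invented for illustration, not any real wearable API.

```python
import json
from datetime import datetime, timezone

class WearableOutbox:
    """Toy store-and-forward queue for wearable readings."""

    def __init__(self):
        self._queue = []

    def record(self, metric, value):
        # "Store": capture a reading with a timestamp.
        self._queue.append({
            "metric": metric,
            "value": value,
            "recorded_at": datetime.now(timezone.utc).isoformat(),
        })

    def forward(self):
        # "Forward": hand the whole batch off at once. Here we just
        # serialize it; a real system would POST it to the provider.
        batch = json.dumps(self._queue)
        self._queue = []
        return batch

outbox = WearableOutbox()
outbox.record("heart_rate_bpm", 72)
outbox.record("steps", 8450)
payload = outbox.forward()
```

The key property of store-and-forward is visible in the sketch: collection and review are decoupled, so the clinician assesses the batch on their own schedule.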
EHRs are going mobile
More and more providers want to provide medical care from their smartphones, and more patients want to access data through mobile devices. Contributing factors to the popularity of mobile devices include their affordability, ease of use and portability (meaning they are easy to carry between patient exams to access electronic patient information). One of the other drivers of mobile technology in healthcare is the availability of myriad apps for smartphones and tablets. For each of the major smartphone operating systems, there is now an app for almost every conceivable healthcare need, ranging from drug dose calculators to fully functioning electronic medical records. Healthcare apps play a pivotal role in changing the utility of mobile devices. They are transforming smartphones and tablets into medical instruments that capture blood test results, medication information, glucose readings and medical images, enabling physicians and patients to better manage and monitor health information. Healthcare apps are clearly taking on more mainstream health IT functions and have moved beyond sporadic use by early adopters.
From these facts we may conclude that EHRs will offer better mobile design and functionality.
More EHRs will move to the cloud
Start-up costs for EHRs can prove burdensome for some institutions, while cloud-based tools offer minimal start-up costs and can make better use of providers’ current resources. The cloud also enables better continuity of care. Cloud-based software means you can access records from outside the office. It makes mobile access possible. It makes transferring records a snap. And it makes updating software seamless for providers.
In the coming year, more and more EHRs will offer cloud services.
More EHRs will provide patient portals
Though patient portal usage got off to a slow start in 2013, it has grown in popularity over the last two years.
While about half of physicians offer patient portals right now, almost another fifth of them plan to offer one in the next 12 months. In a 2015 survey of more than 11,000 patients, 237 physicians, and nine payer organizations representing 47 million lives, almost a third of patients said they were interested in using a patient portal to engage with their physician, track their medical history and receive educational materials and patient support.
More providers will both offer and promote patient portals. Some may even have patients use the portals during office visits to begin getting their data into the system. And patients will start to see their value. Educating patients on how and why to use portals will be the key to adoption.
Big data will reveal more connections
Personalized medicine enabled by big data is an emerging trend in healthcare. Innovation will continue apace in 2016.
Personalized medicine focuses on analyzing a person's genome along with environmental, social, biometric, and religious influences, and determining a treatment for the individual based on that data. It's about moving from a one-size-fits-all approach to creating micro-buckets of patients by analyzing their medical records and genome sequences, and treating patients based on the research and records of how other patients in similar situations have responded. Big data is working to identify the behaviors, risk factors, and early indicators of disease so doctors can prevent it more effectively.
Big data is only the first step. That data must be cleaned and structured so it can reveal patterns in factors that influence outcomes.
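As a toy illustration of that clean-then-structure step, the sketch below (using made-up records, not real clinical data) normalizes messy patient rows and groups them into crude "micro-buckets" whose outcomes can then be compared:

```python
from collections import defaultdict

# Invented, deliberately messy records: inconsistent casing and
# stray whitespace stand in for real-world data-quality problems.
raw_records = [
    {"age": "54",  "smoker": "YES", "outcome": "improved"},
    {"age": " 57", "smoker": "yes", "outcome": "improved"},
    {"age": "23",  "smoker": "no",  "outcome": "improved"},
    {"age": "61",  "smoker": "no",  "outcome": "declined"},
]

def clean(rec):
    # Coerce types and normalize categorical values.
    return {
        "age": int(rec["age"].strip()),
        "smoker": rec["smoker"].strip().lower() == "yes",
        "outcome": rec["outcome"],
    }

def bucket(rec):
    # A crude "micro-bucket": age band plus smoking status.
    band = "50+" if rec["age"] >= 50 else "<50"
    return (band, rec["smoker"])

buckets = defaultdict(list)
for rec in map(clean, raw_records):
    buckets[bucket(rec)].append(rec["outcome"])
```

Only after the cleaning step do the buckets line up, which is the point: patterns emerge from structured data, not from the raw feed.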
Moving forward, technology will continue to transform the healthcare industry as it plays a key role in new healthcare delivery models. EMR/EHR, mHealth, telemedicine, and the many other technologies identified here will continue to increase their footprint in this growing industry. Where do you see healthcare IT going this year? What EHR trends are you most excited about, and what trends did I miss? Let me know in the comments!
The Internet of Things (IoT) includes any form of technology that can connect to the internet: smartphones, TVs, various sensors, robots, fitness and medical equipment, ATMs, wearables, and much more than this. Just imagine, lawn sensors that tell a sprinkler when a lawn needs to be watered and how much water is needed based on moisture levels; running shoes that clock your pace ─ and notify you when you’ve run so much that it’s time to replace your shoes; and refrigerators that let you know when food products are reaching their expiration date.
The size of the Internet of Things market is immense. According to research firm IDC, the global market was already worth $1.9 trillion last year, and these numbers will grow greatly in the coming years.
Networking and cloud computing are the key factors that make the IoT possible and help to create a special IoT ecosystem. In fact, with so much data flowing in from potentially millions of different connected objects, the cloud is likely the only platform suitable for filtering, analyzing, storing and accessing all that information in useful ways. The cloud is accessible from anywhere and from any device. So, the more devices are connected, the greater the use of public cloud services will be.
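Here is a tiny sketch of the kind of filtering and aggregation a cloud backend might perform on raw sensor data before acting on it, in the spirit of the lawn-sensor example above. The readings, sensor names and watering threshold are invented for illustration.

```python
from statistics import mean

# Simulated raw readings arriving from moisture sensors.
readings = [
    {"sensor": "lawn-1", "moisture": 0.31},
    {"sensor": "lawn-1", "moisture": -1.0},   # faulty reading
    {"sensor": "lawn-1", "moisture": 0.29},
    {"sensor": "lawn-2", "moisture": 0.55},
]

def valid(r):
    # Filter: drop physically impossible values before they skew results.
    return 0.0 <= r["moisture"] <= 1.0

def average_moisture(readings, sensor):
    # Aggregate: average the valid readings for one sensor.
    values = [r["moisture"] for r in readings
              if r["sensor"] == sensor and valid(r)]
    return mean(values) if values else None

# Act: decide whether the sprinkler should run.
needs_water = average_moisture(readings, "lawn-1") < 0.35
```

The same filter-aggregate-act shape scales from four readings to millions; the cloud's role is to run it at that larger scale.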
Here are some thoughts on the direction the cloud may take in the coming years:
Special-purpose clouds may appear that focus specifically on connecting devices and machines. So in the coming years, we'll see increased focus on software, and especially on cloud services, that connects all those sensors, processes the immense volumes of data received from the devices, and provides strong analytical tools and systems that generate insights and enable business improvements.
We should also not forget information security, privacy and data protection. Most consumer IoT services rely on the public cloud as a key enabling technology, where the security of the data cannot be guaranteed. People will resist the ubiquitous free flow of information if there is no public confidence that it will not pose serious threats to privacy. In the coming years we may see the rise of new tools that prevent information leakage and secure consumers' information in the cloud.
Just a couple of years ago cloud computing was just a buzzword, and now it plays an important role in the IT world. Although the global IoT market is still in its infancy, it's highly probable that in a couple of years the IoT will become an inseparable part of our lives. It will dramatically change the way we live our daily lives and what information is stored about us. How do you believe the cloud might evolve as the IoT does?
Microsoft Azure (called Windows Azure before 25 March 2014) is a cloud computing platform and infrastructure, created by Microsoft, for building, deploying and managing applications and services through a global network of Microsoft-managed data centers. It is a growing collection of integrated services – compute, storage, data, networking and apps.
It provides both PaaS and IaaS services, which for the general public means a powerful combination of managed and unmanaged services. These services let you build, deploy and manage applications any way you like. Its hybrid cloud solution allows you to store, back up and recover data, and to build applications, both in your own data center and in the public cloud.
With cloud and hybrid services expected to reach US$108 billion by 2017, demand for Microsoft’s cloud products including Microsoft Azure is booming. For now:
- 57% of Fortune 500 companies are using Microsoft Azure
- It welcomes 1,000 new customers per day
- Currently 1.2 million businesses and organizations use Microsoft Azure Active Directory
- Microsoft Azure doubles its compute and storage capacity every 6-9 months
What benefits do companies gain from using Microsoft Azure?
Using a cloud computing platform service like Microsoft Azure provides companies with a number of benefits apart from premium storage space and high-performance. The business benefits include:
- Efficiency – Azure solutions and services are known for delivering better-quality products as well as high operational efficiency because of reduced capital costs. Customers and partners can realize a huge reduction in total cost of operations and reduced workloads in a short time period.
- Increased scalability to match demand – as your customer base grows and the usage of your application increases you can just add additional capacity to make sure your application is running smoothly. You don’t have to worry about running out of server capacity.
- More flexibility and creativity – applications can very quickly be deployed to the Microsoft Azure platform which means that changes can be applied without any downtime. This makes it an ideal platform for your developers to add functionality to your application.
- Agility – developers will find a host of development tools to take advantage of, including automated service management and a growing international data center presence to respond faster to diverse customer needs.
- Simplicity – Azure makes use of prevailing development skills in familiar languages such as .NET and even open-source languages like Java and PHP to build and manage applications and services.
- Trustworthiness – Windows Azure delivers enterprise-class service with consistent service level agreements backed by Microsoft's extensive service experience.
Among Azure customers are companies such as HEINEKEN, GE Healthcare, Temenos, Zespri International, 3M, Skanska USA, Xerox and Diebold, which speaks for itself.
What position does Microsoft Azure hold in the public cloud?
According to RightScale's 2015 State of the Cloud report, Azure is progressing among enterprises, while Amazon Web Services (AWS) continues to dominate the public cloud, with 57 percent of technical professionals saying they run applications on AWS. That's up from 54 percent a year earlier.
By comparison, Microsoft Azure’s cloud platform and infrastructure posted a combined score of 19 percent. But Microsoft is making gains, posting a 6 point jump in the number of tech professionals using its cloud infrastructure.
Google’s Cloud Platform offerings came in behind Azure, with 8 percent of survey respondents using Google App Engine, and only 5 percent using Google’s infrastructure products.
Microsoft has put a huge amount of work into marketing Azure to large enterprises, so it's not surprising that large businesses are Microsoft's core customers. There's also room for that business to grow: a majority of enterprise users responding to the survey said that less than 20 percent of their company's app portfolio is in the cloud.
What do you think of Microsoft Azure? What future do you predict for it? Thank you for sharing your thoughts!
The infrastructure-as-a-service (IaaS) market has exploded in recent years. Google stepped into the fold of IaaS providers, somewhat under the radar. The Google Cloud Platform is a group of cloud computing tools for developers to build and host web applications.
It started with services such as the Google App Engine and quickly evolved to include many other tools and services. While the Google Cloud Platform was initially met with criticism of its lack of support for some key programming languages, it has added new features and support that make it a contender in the space.
Here’s what you need to know about the Google Cloud Platform.
1. Pricing
Google recently shifted its pricing model to include sustained-use discounts and per-minute billing. Billing starts with a 10-minute minimum and is charged per minute thereafter. Sustained-use discounts begin after a particular instance is used for more than 25% of a month. Users receive a discount for each incremental minute used after they reach the 25% mark.
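The mechanics can be sketched in a few lines. The per-minute rate and the flat discount below are made-up numbers for illustration, not Google's actual prices, and real sustained-use discounts are tiered rather than flat; this is only a simplified model of the 10-minute minimum and the 25% threshold described above.

```python
# Simplified month: 30 days, expressed in minutes.
MINUTES_PER_MONTH = 30 * 24 * 60
SUSTAINED_THRESHOLD = 0.25 * MINUTES_PER_MONTH

def instance_cost(minutes_used, rate_per_minute=0.001, discount=0.2):
    # Bill at least 10 minutes, then per minute.
    billable = max(minutes_used, 10)
    # Minutes up to the 25% mark are billed at the full rate...
    base = min(billable, SUSTAINED_THRESHOLD)
    # ...and minutes beyond it earn the (illustrative) discount.
    discounted = max(billable - SUSTAINED_THRESHOLD, 0)
    return (base * rate_per_minute
            + discounted * rate_per_minute * (1 - discount))

short_run = instance_cost(3)                    # billed as 10 minutes
half_month = instance_cost(MINUTES_PER_MONTH // 2)
```

Even this toy version shows the incentive the model creates: the longer an instance runs past the threshold, the cheaper each additional minute becomes.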
2. Cloud Debugger
The Cloud Debugger gives developers the option to assess and debug code in production. Developers can set a watchpoint on a line of code, and any time a server request hits that line, they will get all of the variables and parameters of that code. According to a Google blog post, there is no overhead to running it, and "when a watchpoint is hit very little noticeable performance impact is seen by your users."
3. Cloud Trace
Cloud Trace lets you quickly figure out what is causing a performance bottleneck and fix it. The base value add is that it shows you how much time your product is spending processing certain requests. Users can also get a report that compares performances across releases.
4. Cloud Save
The Cloud Save API was announced at the 2014 Google I/O developers conference by Greg DeMichillie, the director of product management on the Google Cloud Platform. Cloud Save is a feature that lets you “save and retrieve per user information.” It also allows cloud-stored data to be synchronized across devices.
5. Hosting options
The Cloud Platform offers two hosting options: App Engine, its Platform-as-a-Service, and Compute Engine, its Infrastructure-as-a-Service. In the standard App Engine hosting environment, Google manages all of the components outside of your application code.
The Cloud Platform also offers managed VM environments that blend the auto-management of App Engine with the flexibility of Compute Engine VMs. The managed VM environment also gives users the ability to add third-party frameworks and libraries to their applications.
6. Andromeda
Google Cloud Platform networking tools and services are all based on Andromeda, Google's network virtualization stack. Having access to the full stack allows Google to create end-to-end solutions without compromising functionality based on available insertion points or existing software.
According to a Google blog post, “Andromeda is a Software Defined Networking (SDN)-based substrate for our network virtualization efforts. It is the orchestration point for provisioning, configuring, and managing virtual networks and in-network packet processing.”
7. Containers
Containers are especially useful in a PaaS situation because they help speed up deployment and the scaling of apps. For those looking for container management for virtualization on the Cloud Platform, Google offers its open-source container scheduler, Kubernetes. Think of it as a Container-as-a-Service solution, providing management for Docker containers.
8. Big Data
The Google Cloud Platform offers a full big data solution, but there are two unique tools for big data processing and analysis on Google Cloud Platform. First, BigQuery allows users to run SQL-like queries on terabytes of data. Plus, you can load your data in bulk directly from your Google Cloud Storage.
The second tool is Google Cloud Dataflow. Also announced at I/O, Google Cloud Dataflow allows you to create, monitor, and glean insights from a data processing pipeline. It evolved from Google’s MapReduce.
9. Maintenance
Google does routine testing and regularly sends patches, but it also sets all virtual machines to live-migrate away from maintenance as it is being performed.
“Compute Engine automatically migrates your running instance. The migration process will impact guest performance to some degree but your instance remains online throughout the migration process. The exact guest performance impact and duration depend on many factors, but it is expected most applications and workloads will not notice,” the Google developer website said.
VMs can also be set to shut down cleanly and restart away from the maintenance event.
10. Load balancing
In June, Google announced the Cloud Platform HTTP Load Balancing to balance the traffic of multiple compute instances across different geographic regions.
“It uses network proximity and backend capacity information to optimize the path between your users and your instances, and improves latency by connecting users to the closest Cloud Platform location. If your instances in one region are under heavy load or become unreachable, HTTP load balancing intelligently directs new requests to your available instances in a nearby region,” a Google blog post said.
Taken from TechRepublic
Software-defined networking (SDN) is a hot, much-debated topic, and although still in its infancy, it offers the potential to transform how complex networks work. But don't be fooled into thinking it's just more industry hype: the era of Software Defined Everything is already upon us. Software is being applied to everything from servers, storage and data centres right through to arguably the most ground-breaking piece of the jigsaw: the Wide Area Network.
SDN changes the way companies build their IT environments by essentially moving the “control plane” of the network away from each individual device in the network to a central controller that works with all the devices, both virtual and physical. This allows for a single controller to configure or manage the complete network, as opposed to each device managing its own functionality and being programmed individually. The technology has huge benefits for businesses, including reducing IT expenditure and enabling changes to the network quickly and easily.
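The control-plane shift can be sketched as a toy model: a single controller object holds the network policy and pushes it to every registered device, instead of each device being programmed individually. The device names and "rules" below are invented for illustration; a real SDN controller would speak a protocol such as OpenFlow to its switches.

```python
class Switch:
    """Data-plane device: it just applies whatever rules it is given."""

    def __init__(self, name):
        self.name = name
        self.rules = []

    def apply(self, rules):
        self.rules = list(rules)

class Controller:
    """Central control plane: one place to change the whole network."""

    def __init__(self):
        self.devices = []
        self.policy = []

    def register(self, device):
        self.devices.append(device)

    def set_policy(self, rules):
        # One change here reconfigures every registered device at once,
        # instead of an admin touching each box separately.
        self.policy = list(rules)
        for device in self.devices:
            device.apply(self.policy)

ctl = Controller()
switches = [Switch(f"sw-{i}") for i in range(3)]
for sw in switches:
    ctl.register(sw)
ctl.set_policy(["allow 10.0.0.0/8", "deny 0.0.0.0/0"])
```

Contrast this with the status quo described below, where each device keeps its own policies and consoles and a network-wide change means reconfiguring every box by hand.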
The importance of the network
SDN deployments are still very limited and at their early stages of development. This is due in part to the fact that today’s corporate networks use open standards such as the IP protocol and Ethernet connectivity, but configuring the networks themselves often requires lots of manual tasks because each device on the network has separate policies and consoles. Making significant changes in the network – even with existing hardware – can be time-consuming, potentially taking a week or two. With the move towards server virtualisation and cloud computing, this has become even more complex.
With this in mind, it is no surprise that SDN is making its way to centre stage. SDN is being tackled from all sides of the ecosystem, from virtualisation vendors like VMWare to the traditional networking providers like Cisco. Not only is it going to fundamentally change the business models of the networking and server industries, but it is also going to escalate the importance of the network.
The value that SDN poses for businesses is immense. It holds greater potential for productivity increases from IT than any other development because of the way it acts as a unifying force between disparate elements – computing, networking, virtualisation, information, and business logic. There’s no doubt that SDN will be a disruptive force across cloud, carrier and enterprise networks, likely in that order. The natural progression of turning hardware into software will result in re-architected networks, data centres and infrastructures.
What the future holds
The integration of everything into the network will become a no-brainer in the coming year and this will essentially transform the network into the epicenter of ICT services. While no one can predict the SDN end-game, we are at the cusp of a revolution in the way global networks are designed, built, and managed.
By providing more real-time intelligence and deep application integration, SDN is going to enable enterprises to realise innovation earlier, with applications rolled out in hours instead of weeks. Organisations will achieve never-before-seen levels of agility while reducing both capital and operational overhead to the lowest levels ever delivered in enterprise solutions.
As a platform, SDN provides the potential to drive the next generation of IT services. Early high visibility adopters like Google and the recent significant increase in VC funding into the SDN area is fuelling momentum and the emergence of the era of Software Defined Everything looks set to change the power of the network for good. Organisations should be looking very seriously at how SDN can benefit their businesses before their competitors get there first.
The early days of video gaming seem to be long gone. Video game companies now offer players new graphics and playing options to give them what they want and let them make better choices.
Cloud gaming is one of the newest and fastest-growing trends in the gaming industry. Until recently, gamers had to choose which game platform to buy: console, PC or portable device. Not anymore. Thanks to cloud gaming services, gamers can play through the cloud on almost any display, including TVs, monitors, laptops, tablets and even smartphones.
But what actually is cloud gaming?
Cloud gaming is a form of online gaming that uses a cloud provider for streaming. This means that, like all online games, whether multiplayer, Xbox or PlayStation titles, cloud games need a network connection to be played. However, instead of running a playable copy locally, you stream the game itself instantly from the cloud service.
The main advantages of cloud gaming are:
1. Instantly playable games in your browser. Cloud gaming allows a game to be streamed instantly and played within seconds.
2. No installation needed. All games are stored on a cloud service, so there is no need to download and install them on your hard drive.
3. No specific hardware required. Game content isn't stored on the user's machine and game code executes primarily on the server, so you can run almost all modern games even on a less powerful computer. Your computer only needs to be able to play HD video (720p) and have an internet connection of at least 5 Mbit/s with low latency.
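A quick back-of-the-envelope check of that 5 Mbit/s figure shows how much data an hour of streaming at that rate actually transfers:

```python
# Rate quoted above for 720p cloud-gaming streams.
STREAM_MBIT_PER_S = 5

def data_per_hour_gb(mbit_per_s):
    # Megabits/s -> bits in one hour -> bytes -> gigabytes (decimal GB).
    bits = mbit_per_s * 1_000_000 * 3600
    return bits / 8 / 1_000_000_000

hourly_gb = data_per_hour_gb(STREAM_MBIT_PER_S)  # 2.25 GB per hour
```

At roughly 2.25 GB per hour of play, a steady connection matters as much as a fast one, which leads directly to the first disadvantage below.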
Of course, the positive benefits and features come with some negatives. So let's see what they are:
1. The main disadvantage of cloud gaming at the moment is the internet. It requires a reliable and fast connection to stream the game to your TV or monitor at home. Without a decent connection, games can look slow and become unplayable.
2. The second-hand market. A large number of people buy second-hand games: once they complete a title, they generally trade in the old game for a new one. With cloud gaming, you never own a physical copy, which makes the whole process of trading in your old game for a new one impossible.
Gaikai and OnLive
Currently there are two growing cloud gaming projects, both launched in 2009-2010: the OnLive Game Service and Gaikai, platforms which breathed new life into video game development.
OnLive is available on different devices: TV consoles, tablets, PCs, Macs and smartphones. On the official web site/store www.onlive.com, games can be purchased, rented or downloaded as a free trial. Besides, for $100 you can buy the OnLive Game System box, which lets cloud games run even on your TV. Games can also be played on your tablet or smartphone, as well as from your PC, Mac or TV, via the OnLive Wireless Controller, which costs $50.
OnLive also provides worldwide interactive play: you can share your gameplay with other players in the spectating Arena, instantly share your best video moments on Facebook, or talk with other players via Voice Chat.
Alternatively, there is Gaikai (www.gaikai.com), a cloud-based gaming service that allows users to play high-end PC and console games via the cloud and, unlike OnLive, to instantly demo games and applications from a web page or internet-connected device. Gaikai's library of games is not too big, but it includes a number of popular titles that OnLive lacks, for example FIFA 12, Bulletstorm, Crysis 2, Dead Space 2, Dragon Age 2 and others.
The benefit of Gaikai’s service is that the company isn’t limited to gaming. The company is actively soliciting streaming partners to utilize Gaikai’s infrastructure, servers, and platform.
On July 2, 2012, Sony Computer Entertainment invested $380 million with plans to establish its own new cloud-based gaming service.
Betting on the future?
Is cloud gaming the future? Companies like Sony, Gaikai and OnLive certainly think so, as they are investing in its development and promotion. At the same time, gamers still doubt the game quality and prefer playing on consoles rather than in the cloud. The main problems and uncertainties gamers point to mostly relate to buying habits and the need to stay online while playing. The internet connection question seems to be getting resolved, with cable providers like AT&T, Verizon, Time Warner and Comcast planning to enter the cloud-gaming space, debuting their services as early as next year. The last thing to overcome is the attachment to physical ownership.
So maybe, if these downsides can be turned into benefits, it will win over even the biggest skeptics and make them believers.
Thank you for your attention and feel free to leave your comments and share your thoughts/experience at this point!
Not long ago I had to answer, or at least try to answer, one very hard question: which is better for cloud computing, Windows Azure or Amazon Web Services (AWS)? Why can no other provider fit the bill, you may ask? The thing is that most of them can't match the price points or scale that Amazon or Azure provide. In general, building up the operational capability to provide a service like AWS or Azure is a difficult proposition. Both AWS and Azure provide multiple locations and pay-by-the-hour capability, which is actually really hard to do without massive capital behind it.
As the Amazon-vs-Azure debate is a long-lived one, it was really hard to find the answer. I consulted our technical specialists, googled the problem, asked LinkedIn and Xing members to share their opinions, conducted some polls, etc. As you can imagine, there was no definite answer, no cure-all pill: the opinions differed.
Both AWS and Windows Azure are quite young: AWS was born in 2006 and Windows Azure in 2008. However, their services are used by world-known corporations such as NASA, Ericsson, Boeing and Xerox, and both are considered leaders of today's cloud sphere. The two platforms are very alike and share many of the same characteristics.
Among the two gorillas in the cloud space, some developers prefer Amazon, others think Azure is best, but the details are often sparse as to why one option is better than the other. Among the reasons for choosing AWS or Azure over the other, they name cost, available resources, development tools and ecosystem.
Language support also matters to a great extent. AWS is platform-agnostic, while Azure is Windows-based. Getting stuck in a single framework like .NET, where there is only one provider of tools, can be a huge hindrance to any future decisions you make as a company.
Microsoft (and Azure by extension) seems to be all about lock-in: lock-in on the operating system, lock-in on the language platform, as well as lock-in on the Azure services. Also, many companies have to solve big compute problems that Java, unlike .NET, is well positioned for. While many larger companies don't have to be as concerned with lock-in, this is a very scary thought for most start-ups that need a clearer long-term cost structure.
Mostly the real choice is between IaaS and PaaS. Azure is essentially PaaS, while AWS is special in that it provides both IaaS and PaaS. One advantage of AWS is that you get the platform capabilities plus the freedom to easily deploy brand-new technologies before they become part of the platform. So people need to decide what is more important for them and how much being on the cutting edge matters.
And what do you prefer personally: Amazon Web Services or Windows Azure? Feel free to share your opinions and considerations.