Altabel Group's Blog

Archive for the ‘Cloud’ Category

During the annual Healthcare Information and Management Systems Society (HIMSS) conference, IBM CEO Ginni Rometty declared that the era of cognitive computing in healthcare is upon us.

“It actually is an era that will play out in front of us, which is what we call the cognitive era,” Rometty said. “I hope to persuade you … that this idea of cognitive healthcare, systems that learn, that this is real and it’s mainstream and it is here and it can change almost everything about healthcare.”

The official IBM website says that the IBM Watson Health mission is to empower leaders, advocates and influencers in health through support that helps them achieve remarkable outcomes, accelerate discovery, make essential connections and gain confidence on their path to solving the world’s biggest health challenges.

Let’s look into what IBM Watson is and what exactly it will bring us.

IBM Watson is an advanced artificial intelligence program that is turning healthcare into a quantifiable service: every relevant piece of information is available, and physicians only have to go through a personalized report instead of reading through dozens of papers for every patient’s case.

Here are just some of the improvements that IBM Watson will bring to healthcare.

Your doctor will be well-informed

At the moment, one of the most significant challenges in healthcare is the sheer amount of information available. Your doctor cannot be aware of everything that has been published recently. Watson, however, is able to search through all of it, so doctors don’t have to spend hours and hours reading and investigating.

It is currently being used in genome analysis research at a hospital in the US, where it found that a third of patients were affected by information published in articles since their treatment began.

You’ll be recommended better treatments

If, for example, you’re diagnosed with cancer, you might benefit from the Watson for Oncology platform. Usually the doctor meets with cancer patients and spends time reviewing their notes, which would be presented on paper or in a list of emails. As a result, a doctor’s decision is based on his individual experience and the information available in front of him.

IBM Watson takes all those unstructured notes and restructures them in a way the doctor can check easily, along with treatment recommendations: which drug to give, which radiation, what dosage.

You will be prescribed better medication

A very important aspect of IBM Watson is medication. It generally takes about 12 years to bring a new drug to market, but recent tests at the Baylor College of Medicine in Houston, Texas, have reduced significant parts of the research process to months, weeks and even days. IBM Watson is able to accelerate the discovery of new treatments by streamlining research processes. As a patient, you will benefit from having more appropriate treatments available when you need them.

It’s clear that IBM Watson is already transforming healthcare, but much progress still lies ahead.

“We’re just at the beginning of something that will be very big and very transformative over the next 50 years,” said Watson Healthcare Executive Lead, Thomas Balkizas.

Feel free to share your thoughts on IBM Watson’s prospects for the near future in the comments below!

 


Yana Khaidukova

Business Development Manager

E-mail: yana.khaidukova@altabel.com
Skype: yana_altabel
LI Profile: Yana Khaidukova

 


Altabel Group

Professional Software Development

E-mail: contact@altabel.com
www.altabel.com


Digital health is dramatically reshaping and redefining how healthcare is delivered. Here are some trends that we can observe today and that are expected to shape the future of eHealth.
 
Distributed Healthcare

New technological aids have changed the relationship between patient and doctor. Patients can now google information about illnesses and treatments, read their digital patient journal online, learn of their doctor’s findings, and take responsibility for their own care in a completely different way than in the past.

The use of digital and mobile IT solutions in healthcare means that care is no longer available only in a specific location. Nowadays, patients have the right to choose where they wish to be treated and, in the future, this will not only include choosing which hospital to visit, but also whether to hold their appointments via video link or to treat their depression using online therapy.
 
Smart Devices

Apps and mobile technology are already a natural part of our everyday life.
There are a number of eHealth applications now available; one of them is the digital diary, which allows patients to record measurement data and self-assessments, or to note down their general physical and mental state during the day, and then forward this information to their doctor.

Apps like this also give patients a simple means by which to take greater control over their own well-being, whether related to blood-sugar levels, blood pressure, or mood.
At the moment, healthcare does not use all the rich data that this type of smart device can provide. However, through projects such as the Swedish eHealth Agency’s Health for Me and other platforms that allow patients to collect their health data, an attempt is being made both to understand this digital “treasure” and to find ways to utilize it for the benefit of patients and providers alike.
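
To make the digital-diary idea concrete, here is a minimal sketch in Python. The entry fields and the clinic endpoint are hypothetical illustrations, not taken from any particular eHealth product:

```python
import json
from dataclasses import dataclass, asdict
from datetime import date

# Hypothetical diary entry a patient might record during the day.
@dataclass
class DiaryEntry:
    day: str
    blood_pressure: str        # e.g. "120/80"
    blood_sugar_mmol_l: float
    mood: str                  # free-text self-assessment

entry = DiaryEntry(day=str(date.today()),
                   blood_pressure="118/76",
                   blood_sugar_mmol_l=5.4,
                   mood="calm, slept well")

# "Forwarding to the doctor" is just shipping structured data;
# the URL below is a placeholder, not a real service.
payload = json.dumps(asdict(entry))
print("Would POST to https://clinic.example.com/api/diary:", payload)
```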
 
Interoperability

One major feature of eHealth is large IT systems. These are designed to suit a broad user base, which invariably makes it difficult for them to cater to any single user’s specific needs. The future lies in creating smaller, customized systems that can communicate with one another through interoperability. Custom-designed digital solutions mean opening up the market to small-scale actors and utilizing the entire ecosystem during development.
 
Big Data

Big Data has changed the way we manage, analyze and operate on data in every industry, and healthcare is obviously one of the most promising areas where it can make a difference. Looking ahead, healthcare analytics can reduce the cost of treatment, predict outbreaks of epidemics, help avoid preventable diseases and improve quality of life in general. Treatment delivery faces new challenges today: the average human lifespan is increasing along with the world population. Healthcare professionals, just like business entrepreneurs, are now capable of collecting massive amounts of data, and they need the best strategies for putting those numbers to use.

Even if healthcare services are not something that excites you, you are still a potential patient, and just like every one of us you should be aware of new healthcare analytics applications and how they can help you.
 
Artificial Intelligence

Anytime a new technology enters healthcare, it faces a number of challenges. Common setbacks for artificial intelligence in healthcare include a lack of data exchange, regulatory compliance requirements, and patient and provider adoption. AI has run into all of these issues, which narrows down the areas in which it can succeed.
The most popular use of artificial intelligence in healthcare is in IBM’s smart cloud, where Watson lives. The Watson platform has been used in a number of disciplines within healthcare, including payer services, oncology and patient risk assessment.
 
To learn more about the way IBM Watson works and its prospects for the future, please check out my new article “IBM Watson. Future is closer than you think” next week.

 


Yana Khaidukova

Business Development Manager

E-mail: yana.khaidukova@altabel.com
Skype: yana_altabel
LI Profile: Yana Khaidukova

 


Altabel Group

Professional Software Development

E-mail: contact@altabel.com
www.altabel.com

As the Internet of Things continues to grow, a huge amount of data is going to be generated. How huge is “huge”? Really huge. I do mean that.

Physical devices across the globe are consuming and creating data to drive a continuously connected world. David Booth, CEO at BackOffice Associates believes that currently we are at the tipping point of the Internet of Things. He says, “It was not a big leap for the industry to realize that an IoT global network of continuously connected devices would mean that data would not only be created at geometric rates, but that it would become one of the most valuable commodities in the world.”

2016 was declared the year of the first zettabyte of internet traffic, and a Cisco report says the figure will reach 2.3 ZB by 2020. Before long we will be transferring that much data annually.

If that does not mean much to you, imagine that a byte equals one character of text: a zettabyte would cover War and Peace by Leo Tolstoy (about 1,250 pages) at least 325 trillion times. Or, if 1 gigabyte can store 960 minutes of music, a zettabyte could store just over 2 billion years of music. If that still isn’t illustrative enough, let’s measure in cups of coffee: Cisco states that if the 11 oz coffee on your desk equals one gigabyte, a zettabyte would have the same volume as the Great Wall of China. This amount of information is mind-blowing. The zettabyte has turned Big Data into enormously Big Data.
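
The arithmetic behind those comparisons is easy to check. Here is a back-of-envelope sketch in Python; the pages-per-novel and characters-per-page figures are rough assumptions, so the results only roughly match the numbers above:

```python
# Back-of-envelope check of the zettabyte comparisons above.
ZETTABYTE = 2 ** 70            # ~1.18e21 bytes (binary prefix)
GIGABYTE = 2 ** 30

# War and Peace: ~1,250 pages, assume ~2,500 characters (bytes) per page.
war_and_peace_bytes = 1_250 * 2_500
copies = ZETTABYTE / war_and_peace_bytes
print(f"{copies:.2e} copies of War and Peace")      # hundreds of trillions

# Music: the article assumes 1 GB holds about 960 minutes of audio.
minutes = (ZETTABYTE / GIGABYTE) * 960
years = minutes / (60 * 24 * 365.25)
print(f"~{years / 1e9:.1f} billion years of music")  # roughly two billion
```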
 

The Internet of Things (IoT) is expanding rapidly and relentlessly, and as it grows, so do the volumes of data it generates. Ignoring this fact is not an option; companies that do so act at their own peril.

Though there are many new start-up companies storing, analyzing and integrating the massive amounts of data created by the IoT, not many of them have actually considered how the IoT can and will transform organizational thinking about data quality and information governance.

With so much data being created, companies must understand what they want to do with it, define their data requirements and ensure that they have access to the right data. Unless a company can find a way to accumulate, manage and, most importantly, monetize its data, data hoarding can become a real issue. Put simply, while the value IoT brings is in the information it creates, the innovation gold lies in the filtered data an organization extracts in the intermediate layer between the devices and the cloud (the so-called “fog”).
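
As a rough illustration of that “fog” idea, here is a minimal Python sketch of an edge gateway that filters raw sensor readings and forwards only aggregates and anomalies to the cloud. The device names, values and threshold are hypothetical:

```python
from statistics import mean

# Hypothetical edge ("fog") gateway that reduces raw IoT readings
# to the data the cloud actually needs.
ANOMALY_THRESHOLD = 90.0   # e.g. degrees Celsius; purely illustrative

def filter_readings(readings):
    """Return a compact summary plus any anomalous raw readings."""
    anomalies = [r for r in readings if r["value"] > ANOMALY_THRESHOLD]
    summary = {
        "device": readings[0]["device"],
        "count": len(readings),
        "avg": round(mean(r["value"] for r in readings), 2),
        "max": max(r["value"] for r in readings),
    }
    return summary, anomalies

raw = [{"device": "boiler-7", "value": v} for v in (71.2, 70.8, 95.4, 72.0)]
summary, anomalies = filter_readings(raw)
# Only `summary` and `anomalies` would be sent on to the cloud,
# instead of every raw reading.
print(summary, anomalies)
```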

Obviously, this data holds powerful potential for boosting analytics efforts. Analyzing the amount of data that the Internet of Things will create requires new, advanced analytic techniques. The good news is that artificial intelligence and cognitive computing are maturing at a fast pace.

When used properly, analytics can help organizations translate IoT’s digital data into knowledge that contributes to new products, offerings and business models. IoT can provide useful insights into the world outside company walls and help strategists and decision-makers understand their customers, products and markets more clearly. It can drive much more, including opportunities to integrate and automate business processes in ways never imagined before.

Rowan Trollope, Senior Vice President and General Manager of Cisco’s Internet of Things (IoT) and Applications, told participants at the Cisco Live conference, “One of the biggest mistakes you could make now is to underestimate the Internet of Things. This is a life or death issue for most of our customers. They have seen what has happened with Uber and taxi companies and with Netflix and Blockbuster”.

The bottom line is that IoT and Big Data can either disrupt your business or help you become more competitive than the businesses that are about to be disrupted.

 


Alexandra Presniatsova

Business Development Manager

E-mail: Alex.Presniatsova@altabel.com
Skype: alex.presniatsova
LI Profile: Alexandra Presniatsova

 


Altabel Group

Professional Software Development

E-mail: contact@altabel.com
www.altabel.com

Introducing ASP.NET Core:

ASP.NET Core is a new open-source, cross-platform framework for building modern, cloud-based, internet-connected applications such as web apps, IoT apps and mobile backends. ASP.NET Core apps can run on .NET Core or on the full .NET Framework. It was architected to provide an optimized development framework for apps that are deployed to the cloud or run on-premises. It consists of modular components with minimal overhead, so you retain flexibility while constructing your solutions. You can develop and run your ASP.NET Core apps cross-platform on Windows, macOS and Linux. ASP.NET Core is open source on GitHub.

The framework is a complete rewrite that unites the previously separate ASP.NET MVC and Web API into a single programming model.

Despite being a new framework, built on a new web stack, it does have a high degree of concept compatibility with ASP.NET MVC.

The ASP.NET platform has existed for more than 15 years. Moreover, from the moment System.Web was created it contained a large amount of code to support backward compatibility with classic ASP. Over the years the platform accumulated a considerable amount of code that is simply no longer needed and is effectively deprecated. Microsoft faced a difficult choice: abandon backward compatibility, or announce a new platform. They chose the second option, which also meant abandoning the existing runtime. Microsoft has always been a company focused on building for, and running on, Windows, and ASP.NET was no exception. Now the situation has changed: Azure and Linux occupy an important place in the company’s strategy.

ASP.NET Core is poised to replace ASP.NET in its current form. So, should you switch to ASP.NET Core now?

ASP.NET Core is not just a new version; it is a completely new platform, a change of epochs. Switching to ASP.NET Core can bring many benefits: more compact code, better performance and scalability. But what price will be paid in return, and how much code will have to be rewritten?

.NET Core is missing many components we are used to dealing with: forget System.Web, Web Forms, TransactionScope, WPF and WinForms; they no longer exist. For simple ASP.NET MVC applications the changes will be minor and migration will be straightforward. For more complex applications that use a great number of .NET Framework classes and the ASP.NET pipeline, the situation is more complicated: something may work and something may not, and some parts of the code will have to be rewritten from scratch. Additional problems may be caused by Web API, because the ASP.NET MVC and Web API subsystems are now combined. Many libraries and NuGet packages are not ready yet, so some applications simply will not be able to migrate until new versions of those libraries appear.

I think we are in for a situation similar to the transition from Web Forms to ASP.NET MVC. The ASP.NET Framework will be supported for a long time. At first, only a small number of applications will be developed on ASP.NET Core; their number will increase, and sooner or later everyone will want to move to it. We still have many applications running on Web Forms, yet it would not occur to anyone to develop a new application on Web Forms now; everybody chooses MVC. Soon the same will happen with the ASP.NET Framework and ASP.NET Core. ASP.NET Core offers more opportunities to meet modern design standards.

The following characteristics best define .NET Core:

  • Flexible deployment: can be included in your app or installed side-by-side, user- or machine-wide.
  • Cross-platform: runs on Windows, macOS and Linux; can be ported to other operating systems. The supported OSes, CPUs and application scenarios will grow over time, provided by Microsoft, other companies, and individuals.
  • Command-line tools: all product scenarios can be exercised at the command line.
  • Compatible: .NET Core is compatible with the .NET Framework, Xamarin and Mono, via the .NET Standard Library.
  • Open source: the .NET Core platform is open source, using MIT and Apache 2 licenses. Documentation is licensed under CC-BY. .NET Core is a .NET Foundation project.
  • Supported by Microsoft: .NET Core is supported by Microsoft, per .NET Core Support.

The Bad:

  • As for the “cons”, one of the biggest issues is gaps in the documentation. Fortunately, most of what you need for creating an API is covered, but when you’re building an MVC app, you might run into problems.
  • The next problem is the pace of change: even if you find a solution to your problem, it may have been written for a previous version and might not work in the current one. Thanks to the project’s open-source nature, there is also support available on GitHub, but you run into the same versioning problems there (apart from searching).
  • Another thing is the lack of tooling support. You can forget about NCrunch or the ReSharper test runner for now; both vendors say they will get to it when the platform becomes more stable.
  • ASP.NET Core is still quite raw. Many basic things, such as data access, are not fully finished, and there is no guarantee that the code you are using now will work in the release version.

The Good:

  • It’s modular. You can add and remove features as you need them by managing NuGet packages.
  • It’s also much easier and more straightforward to set up.
  • Web API is now part of MVC, so you can have a single UserController class that returns views but also provides a JSON API (see the sketch after this list).
  • It’s cross-platform.
  • It’s open-source.
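
The controller unification mentioned above is specific to C# and ASP.NET Core, but the underlying idea, one handler area serving both an HTML view and a JSON API, is easy to illustrate. Here is a rough, language-neutral sketch using Python and Flask; the routes and data are made up for illustration only:

```python
from flask import Flask, jsonify, render_template_string

app = Flask(__name__)

USERS = [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace"}]

# One "controller" area serving both humans and machines,
# analogous to a unified UserController in ASP.NET Core MVC.
@app.route("/users")
def users_page():
    # HTML view for browsers
    return render_template_string(
        "<ul>{% for u in users %}<li>{{ u.name }}</li>{% endfor %}</ul>",
        users=USERS)

@app.route("/api/users")
def users_api():
    # JSON endpoint for API clients
    return jsonify(USERS)

if __name__ == "__main__":
    app.run(debug=True)
```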

ASP.NET Core is a chance to fix the flaws of classic ASP.NET MVC and to start with a clean slate. In addition, Microsoft aims to become as popular as Ruby and Node.js among younger developers.
Node.js and ASP.NET have always been nominal rivals, both being backend platforms, but in practice there was never much of a fight: the new generation of developers, the so-called hipster developers, prefer Ruby and Node, while the older, corporate generation is on the side of .NET and Java. .NET Core is clearly trying to be more youthful, fashionable and popular, so in the future we can expect .NET Core and Node.js to compete directly.

In its advertising campaign, Microsoft is betting on positions unusual for it: high performance, scalability and cross-platform support. Do you think ASP.NET is encroaching on Node.js territory? Please feel free to share your thoughts with us.

Thank you in advance!

Ref: MICHAL DYMEL – DEVBLOG

 


Darya Bertosh

Business Development Manager

E-mail: darya.bertosh@altabel.com
Skype: darya.bertosh
LI Profile: Darya Bertosh

 


Altabel Group

Professional Software Development

E-mail: contact@altabel.com
www.altabel.com

The Internet of Things (IoT) includes any form of technology that can connect to the internet: smartphones, TVs, various sensors, robots, fitness and medical equipment, ATMs, wearables, and much more. Just imagine: lawn sensors that tell a sprinkler when a lawn needs to be watered and how much water is needed based on moisture levels; running shoes that clock your pace and notify you when you’ve run so far that it’s time to replace them; and refrigerators that let you know when food products are reaching their expiration date.

The size of the Internet of Things market is immense. According to research firm IDC, the global market was already worth $1.9 trillion last year, and this number will grow greatly in the coming years.

Networking and cloud computing are the key factors that make the IoT possible and help to create a distinct IoT ecosystem. In fact, with so much data flowing in from potentially millions of different connected objects, the cloud is likely the only platform suitable for filtering, analyzing, storing and accessing all that information in useful ways. The cloud is accessible from anywhere and from any device, so the more devices are connected, the greater the use of public cloud services will be.

Here are some thoughts on the directions in which the cloud will develop over the next few years:

Special-purpose clouds may appear that focus specifically on connecting devices and machines. In the coming years we’ll see increased focus on software, and especially on cloud services, that connect all the sensors, process the immense volumes of data received from devices, and provide strong analytical tools and systems that generate insights and enable business improvements.

We should also not forget about information security, privacy and protection. Most consumer IoT services rely on the public cloud as a key enabling technology, and there the security of the data cannot be guaranteed. People will resist the ubiquitous free flow of information if there is no public confidence that it will not cause serious threats to privacy. In the coming years we may see the rise of new tools that prevent information leakage and secure consumers’ information in the cloud.

Just a couple of years ago cloud computing was just a buzzword; now it plays an important role in the IT world. Although the global IoT market is still in its infancy, it’s highly probable that in a couple of years IoT will become an inseparable part of our lives. It will dramatically change the way we live day to day and what information is stored about us. How do you believe the cloud might evolve as the IoT does?


Anna Kozik
Anna.Kozik@altabel.com 
Skype ID: kozik_anna
Business Development Manager (LI page)
Altabel Group – Professional Software Development

Microsoft Azure (called Windows Azure before 25 March 2014) is a cloud computing platform and infrastructure, created by Microsoft, for building, deploying and managing applications and services through a global network of Microsoft-managed data centers. It is a growing collection of integrated services – compute, storage, data, networking and app.

It provides both PaaS and IaaS services, which for the general public means a powerful combination of managed and unmanaged services. These services let you build, deploy and manage applications any way you like. Its hybrid cloud solution allows you to store data, backup, recover and build applications in your data center and the public cloud.

With cloud and hybrid services expected to reach US$108 billion by 2017, demand for Microsoft’s cloud products, including Microsoft Azure, is booming. As of now:

  • 57% of Fortune 500 companies are using Microsoft Azure
  • It welcomes 1,000 new customers per day
  • Currently 1.2 million businesses and organizations use Microsoft Azure Active Directory
  • Microsoft Azure gains two times the compute and storage capacity every 6-9 months

What benefits do companies gain from using Microsoft Azure?

Using a cloud computing platform service like Microsoft Azure provides companies with a number of benefits apart from premium storage space and high-performance. The business benefits include:

  • Efficiency – Azure solutions and services are known for delivering better-quality products as well as high operational efficiency because of reduced capital costs. Customers and partners can realize a huge reduction in the total cost of operations and reduced workloads in a short time period.
  • Increased scalability to match demand – as your customer base grows and the usage of your application increases you can just add additional capacity to make sure your application is running smoothly. You don’t have to worry about running out of server capacity.
  • More flexibility and creativity – applications can very quickly be deployed to the Microsoft Azure platform which means that changes can be applied without any downtime. This makes it an ideal platform for your developers to add functionality to your application.
  • Agility – developers will find a host of development tools to take advantage of, including automated service management and an improved international data center presence, allowing them to respond faster to diverse customer needs.
  • Simplicity – Azure makes use of prevailing development skills in familiar languages such as .Net and even open source languages like Java and PHP to produce and manage applications and services.
  • Trustworthiness – Azure delivers enterprise-class service with consistent service level agreements, backed by Microsoft’s extensive service experience.

Among Azure customers are such companies as HEINEKEN, GE Healthcare, Temenos, Zespri International, 3M, Skanska USA, Xerox, Diebold which speaks for itself 🙂

What position does Microsoft Azure hold in the public cloud?

According to RightScale’s 2015 State of the Cloud report, Azure is making progress among enterprises, while Amazon Web Services (AWS) continues to dominate the public cloud, with 57 percent of technical professionals saying that they run applications on AWS, up from 54 percent a year earlier.

By comparison, Microsoft Azure’s cloud platform and infrastructure posted a combined score of 19 percent. But Microsoft is making gains, posting a 6-point jump in the number of tech professionals using its cloud infrastructure.

Google’s Cloud Platform offerings came in behind Azure, with 8 percent of survey respondents using Google App Engine, and only 5 percent using Google’s infrastructure products.

Microsoft has put a huge amount of work into marketing Azure to large enterprises, so it’s not surprising that large businesses are Microsoft’s core customers. There’s also room for that business to grow: a majority of enterprise users responding to the survey said that less than 20 percent of their company’s app portfolio is in the cloud.

What do you think of Microsoft Azure? What future do you predict for it? Thank you for sharing your thoughts 🙂


Yuliya Tolkach
Yulia.Tolkach@altabel.com
Skype ID: yuliya_tolkach
Business Development Manager (LI page)
Altabel Group – Professional Software Development

The infrastructure-as-a-service (IaaS) market has exploded in recent years. Google stepped into the fold of IaaS providers, somewhat under the radar. The Google Cloud Platform is a group of cloud computing tools for developers to build and host web applications.

It started with services such as the Google App Engine and quickly evolved to include many other tools and services. While the Google Cloud Platform was initially met with criticism of its lack of support for some key programming languages, it has added new features and support that make it a contender in the space.

Here’s what you need to know about the Google Cloud Platform.

1. Pricing

Google recently shifted its pricing model to include sustained-use discounts and per-minute billing. Billing starts with a 10-minute minimum and is then charged per minute. Sustained-use discounts begin after a particular instance is used for more than 25% of a month; users receive a discount for each incremental minute used after they reach the 25% mark.
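
To see how that model works in practice, here is a back-of-envelope Python sketch. The per-minute rate and the 20% discount are hypothetical placeholders, not actual Google Cloud prices; only the 10-minute minimum and the 25%-of-month threshold come from the description above:

```python
import math

def billed_minutes(run_minutes: float) -> int:
    """Round a single run up to whole minutes, with a 10-minute minimum."""
    return max(10, math.ceil(run_minutes))

def monthly_cost(total_minutes: int,
                 rate_per_minute: float = 0.001,      # hypothetical base rate
                 minutes_in_month: int = 30 * 24 * 60) -> float:
    """Charge full price up to 25% of the month, then a discounted rate."""
    threshold = minutes_in_month // 4
    regular = min(total_minutes, threshold)
    discounted = max(0, total_minutes - threshold)
    # Hypothetical 20% discount on minutes beyond the 25% threshold.
    return regular * rate_per_minute + discounted * rate_per_minute * 0.8

print(billed_minutes(3.5))               # -> 10, the minimum applies
print(round(monthly_cost(20_000), 2))    # 20,000 minutes used in a 43,200-minute month
```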

2. Cloud Debugger

The Cloud Debugger gives developers the option to assess and debug code in production. Developers can set a watchpoint on a line of code, and any time a server request hits that line, they will get all of the variables and parameters of that code. According to a Google blog post, there is no overhead to run it, and “when a watchpoint is hit very little noticeable performance impact is seen by your users.”

3. Cloud Trace

Cloud Trace lets you quickly figure out what is causing a performance bottleneck and fix it. The base value add is that it shows you how much time your product is spending processing certain requests. Users can also get a report that compares performances across releases.

4. Cloud Save

The Cloud Save API was announced at the 2014 Google I/O developers conference by Greg DeMichillie, the director of product management on the Google Cloud Platform. Cloud Save is a feature that lets you “save and retrieve per user information.” It also allows cloud-stored data to be synchronized across devices.

5. Hosting

The Cloud Platform offers two hosting options: App Engine, its Platform-as-a-Service, and Compute Engine, its Infrastructure-as-a-Service. In the standard App Engine hosting environment, Google manages all of the components outside of your application code.

The Cloud Platform also offers managed VM environments that blend the auto-management of App Engine with the flexibility of Compute Engine VMs. The managed VM environment also gives users the ability to add third-party frameworks and libraries to their applications.

6. Andromeda

Google Cloud Platform networking tools and services are all based on Andromeda, Google’s network virtualization stack. Having access to the full stack allows Google to create end-to-end solutions without compromising functionality based on available insertion points or existing software.

According to a Google blog post, “Andromeda is a Software Defined Networking (SDN)-based substrate for our network virtualization efforts. It is the orchestration point for provisioning, configuring, and managing virtual networks and in-network packet processing.”

7. Containers

Containers are especially useful in a PaaS situation because they help speed up deployment and the scaling of apps. For those looking for container management and virtualization on the Cloud Platform, Google offers its open-source container scheduler known as Kubernetes. Think of it as a Container-as-a-Service solution, providing management for Docker containers.

8. Big Data

The Google Cloud Platform offers a full big data solution, but there are two unique tools for big data processing and analysis on Google Cloud Platform. First, BigQuery allows users to run SQL-like queries on terabytes of data. Plus, you can load your data in bulk directly from your Google Cloud Storage.
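
As a hedged illustration of the “SQL-like queries on terabytes of data” point, here is a minimal Python sketch using the google-cloud-bigquery client library against a public sample dataset. The project ID is a placeholder, and the library installation and credentials are assumed to be set up separately:

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Placeholder project ID; authentication is assumed to be configured
# (e.g. via Application Default Credentials).
client = bigquery.Client(project="my-gcp-project")

sql = """
    SELECT word, SUM(word_count) AS total
    FROM `bigquery-public-data.samples.shakespeare`
    GROUP BY word
    ORDER BY total DESC
    LIMIT 10
"""

# Run the query and print the ten most frequent words in the sample corpus.
for row in client.query(sql).result():
    print(row.word, row.total)
```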

The second tool is Google Cloud Dataflow. Also announced at I/O, Google Cloud Dataflow allows you to create, monitor, and glean insights from a data processing pipeline. It evolved from Google’s MapReduce.

9. Maintenance

Google does routine testing and regularly sends patches, but it also sets all virtual machines to live-migrate away from maintenance as it is being performed.

“Compute Engine automatically migrates your running instance. The migration process will impact guest performance to some degree but your instance remains online throughout the migration process. The exact guest performance impact and duration depend on many factors, but it is expected most applications and workloads will not notice,” the Google developer website said.

VMs can also be set to shut down cleanly and reopen away from the maintenance event.

10. Load balancing

In June, Google announced Cloud Platform HTTP Load Balancing, which balances traffic across multiple compute instances in different geographic regions.

“It uses network proximity and backend capacity information to optimize the path between your users and your instances, and improves latency by connecting users to the closest Cloud Platform location. If your instances in one region are under heavy load or become unreachable, HTTP load balancing intelligently directs new requests to your available instances in a nearby region,” a Google blog post said.

Taken from TechRepublic


Lina Deveikyte
Lina.Deveikyte@altabel.com 
Skype ID: lina_deveikyte
Marketing Manager (LI page)
Altabel Group – Professional Software Development

