Altabel Group's Blog

Archive for the ‘Big Data’ Category

During the annual Health Information and Management Systems Society conference, IBM CEO Ginni Rometty declared that the era of cognitive computing in healthcare is upon us.

“It actually is an era that will play out in front of us, which is what we call the cognitive era,” Rometty said. “I hope to persuade you … that this idea of cognitive healthcare, systems that learn, that this is real and it’s mainstream and it is here and it can change almost everything about healthcare.”

The official IBM website says that the mission of IBM Watson Health is to empower leaders, advocates and influencers in health through support that helps them achieve remarkable outcomes, accelerate discovery, make essential connections and gain confidence on their path to solving the world’s biggest health challenges.

Let’s look into what IBM Watson is and what exactly it will bring us.

IBM Watson is an advanced artificial intelligence program that is turning healthcare into a quantifiable service, one where every relevant piece of information is at hand and physicians only have to go through a personalized report instead of reading through dozens of papers for every patient’s case.

Here are just some upgrades that IBM Watson will bring to healthcare.

Your doctor will be well-informed

At the moment one of the most significant challenges in healthcare is the sheer amount of information available. Your doctor cannot be aware of everything that has been published recently. Watson, however, can search through all of it, so doctors don’t have to spend hours reading and researching.

It’s currently being used in genome-analysis research at a hospital in the US, where it found that a third of patients were affected by information published in articles since their treatments began.

You’ll be recommended better treatments

If, for example, you’re diagnosed with cancer, you might benefit from the Watson for Oncology platform. Usually the doctor meets with cancer patients and spends time reviewing their notes, which would be presented on paper or in a string of emails. Ultimately, the doctor’s decision is based on his or her individual experience and the information available at that moment.

IBM Watson takes all those unstructured notes and restructures them in a way the doctor can check easily, along with treatment recommendations: which drug to give, and which radiation therapy or dosage to use.
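To make this more concrete, here is a minimal sketch of the general idea of structuring free-text notes. This is not Watson’s actual pipeline, which is proprietary and far more sophisticated: the note text, patterns and field names below are invented purely for illustration.

```python
import re

# Invented example note; real clinical notes are far messier.
NOTE = "Pt diagnosed with stage II NSCLC. Started cisplatin 75 mg/m2. Allergic to penicillin."

# Hypothetical extraction patterns, one per structured field.
PATTERNS = {
    "diagnosis":  re.compile(r"diagnosed with ([^.]+)"),
    "medication": re.compile(r"Started (\w+) ([\d.]+ mg/m2)"),
    "allergy":    re.compile(r"Allergic to (\w+)"),
}

def structure_note(text):
    """Turn a free-text note into a record the doctor can scan quickly."""
    record = {}
    for field, pattern in PATTERNS.items():
        match = pattern.search(text)
        if match:
            record[field] = " ".join(match.groups())
    return record

print(structure_note(NOTE))
# {'diagnosis': 'stage II NSCLC', 'medication': 'cisplatin 75 mg/m2', 'allergy': 'penicillin'}
```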

You will be prescribed better medication

A very important aspect of IBM Watson is medication. It generally takes about 12 years to bring a new pill to market, but recent tests at the Baylor College of Medicine in Houston, Texas, have compressed significant parts of the research process into months, weeks, and even days. IBM Watson is able to accelerate the discovery of new treatments by streamlining research processes. As a patient, you will benefit from having more appropriate treatments available when you need them.

It’s clear that IBM Watson is already transforming healthcare, but much progress still lies ahead.

“We’re just at the beginning of something that will be very big and very transformative over the next 50 years,” said Watson Healthcare Executive Lead, Thomas Balkizas.

Feel free to share your thoughts about IBM Watson’s prospects for the near future in the comments below!

 


Yana Khaidukova

Business Development Manager

E-mail: yana.khaidukova@altabel.com
Skype: yana_altabel
LI Profile: Yana Khaidukova

 


Altabel Group

Professional Software Development

E-mail: contact@altabel.com
www.altabel.com


Digital health is dramatically reshaping and redefining how healthcare is delivered. Here are some trends that can be observed now and that are expected to shape the future of eHealth.
 
Distributed Healthcare

New technological aids have changed the relationship between patient and doctor. Patients can now google information about illnesses and treatments, read their digital patient records online, learn of their doctor’s findings and take responsibility for their own care in a completely different way than in the past.

The use of digital and mobile IT solutions in healthcare means that care is no longer tied to a specific location. Nowadays, patients have the right to choose where they wish to be treated and, in the future, this will include not only choosing which hospital to visit, but also whether to hold appointments via video link or to treat depression using online therapy.
 
Smart Devices

Apps and mobile technology are already a natural part of our everyday life.
A number of eHealth applications are now available. One of them is the digital diary, which allows patients to record measurement data and appraisals, or to note down their general physical and mental state during the day, and then forward this information to their doctor.
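As a rough illustration, here is what one diary entry might look like as data. The field names are my own assumption, not the schema of any real product:

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime

# A hypothetical diary entry as a patient app might record it.
@dataclass
class DiaryEntry:
    timestamp: str
    blood_pressure: str    # e.g. "120/80"
    blood_sugar_mmol: float
    mood: int              # self-reported, 1 (low) to 5 (high)
    note: str

entry = DiaryEntry(
    timestamp=datetime(2017, 5, 12, 8, 30).isoformat(),
    blood_pressure="120/80",
    blood_sugar_mmol=5.4,
    mood=4,
    note="Slept well, slight headache after breakfast.",
)

# Serialized to JSON, ready to be forwarded to the doctor's system.
print(json.dumps(asdict(entry), indent=2))
```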

Apps like this also give patients a simple means of taking greater control over their own well-being, whether related to blood-sugar levels, blood pressure, or mood.
At the moment, healthcare providers do not use all the rich data that this type of smart device can provide. However, through projects such as the Swedish eHealth Agency’s Health for Me and other platforms that allow patients to collect their health data, an attempt is being made both to understand and to find ways to utilize this digital “treasure” for the benefit of patients and providers alike.
 
Interoperability

One major feature of eHealth is large IT systems. Because these are designed to suit a broad user base, it is invariably difficult for them to cater specifically to any one user. The future lies in creating smaller, customized systems that can communicate with one another through interoperability. Custom-designed digital solutions mean opening up the market to small-scale actors and utilizing the entire ecosystem during development.
 
Big Data

Big Data has changed the way we manage, analyze and operate on data in every industry, and healthcare is clearly one of the most promising areas where it can make a difference. Looking ahead, healthcare analytics can reduce treatment costs, predict outbreaks of epidemics, help avoid preventable diseases and improve the quality of life in general. Treatment delivery methods face new challenges today: the average human lifespan is increasing together with the world population. Healthcare professionals, just like business entrepreneurs, are capable of collecting massive amounts of data and are looking for the best strategies to put these numbers to use.

Even if healthcare services are not something that excites you, you are still a potential patient, and just like every one of us you should be aware of new healthcare analytics applications and how they can help you.
 
Artificial Intelligence

Anytime a new technology enters healthcare, it faces a number of challenges. Common setbacks for artificial intelligence in healthcare include a lack of data exchange, regulatory compliance requirements, and patient and provider adoption. AI has run into all of these issues, which has narrowed down the areas in which it can succeed.
The most popular use of artificial intelligence in healthcare is IBM’s smart cloud, where Watson lives. The Watson platform has been applied in a number of healthcare disciplines, including payer services, oncology and patient risk assessment.
 
To learn more about how IBM Watson works and its prospects for the future, please check out my new article “IBM Watson. Future is closer than you think” next week.

 


Yana Khaidukova

Business Development Manager

E-mail: yana.khaidukova@altabel.com
Skype: yana_altabel
LI Profile: Yana Khaidukova

 


Altabel Group

Professional Software Development

E-mail: contact@altabel.com
www.altabel.com

As the Internet of Things continues to grow, a huge amount of data is going to be generated. How huge is “huge”? Really huge. I do mean that.

Physical devices across the globe are consuming and creating data to drive a continuously connected world. David Booth, CEO at BackOffice Associates believes that currently we are at the tipping point of the Internet of Things. He says, “It was not a big leap for the industry to realize that an IoT global network of continuously connected devices would mean that data would not only be created at geometric rates, but that it would become one of the most valuable commodities in the world.”

The year 2016 was declared the year of the first zettabyte in internet traffic, and a Cisco report says the number will reach 2.3 ZB annually by 2020. Before long we will be transferring this much data every year.

If that doesn’t mean much to you, imagine that a byte equals one character of text: a zettabyte would cover War and Peace by Leo Tolstoy (about 1,250 pages) at least 325 trillion times. Or, if 1 gigabyte can store 960 minutes of music, a zettabyte could technically store about 2 billion years of music. If that still isn’t illustrative enough, let’s measure in cups of coffee. Cisco states that if the 11 oz coffee on your desk equals one gigabyte, a zettabyte would have the same volume as the Great Wall of China. This amount of information is mind-blowing. The zettabyte has turned Big Data into enormously Big Data.
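These comparisons are easy to sanity-check with back-of-the-envelope arithmetic; the page and character counts below are rough assumptions:

```python
# Back-of-the-envelope check of the zettabyte comparisons above.
ZETTABYTE = 10**21                       # bytes

# War and Peace: ~1,250 pages at roughly 2,500 characters (bytes) per page.
war_and_peace = 1_250 * 2_500            # ~3.1 MB of plain text
print(ZETTABYTE / war_and_peace / 1e12)  # ~320 trillion copies

# Music: if 1 GB holds 960 minutes, 1 ZB (10**12 GB) holds:
minutes = 10**12 * 960
print(minutes / (60 * 24 * 365) / 1e9)   # ~1.8 billion years of music
```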
 

The Internet of Things (IoT) is expanding rapidly and relentlessly. And as IoT grows, so do the volumes of data it generates. Ignoring this fact is not an option; companies that do so act at their own peril.

Though there are many new start-up companies storing, analyzing and integrating massive amounts of big data created by the IoT, not many of them have actually considered how the IoT can and will transform organizational thinking by demanding data quality and information governance.

With so much data being created, companies must understand what they want to do with it and what their data requirements are, and ensure that they have access to the right data. Unless a company can find a way to accumulate, manage and, most importantly, monetize its stored data, data hoarding can become a real issue. Put simply, while the value IoT brings is in the information it creates, the innovation gold lies in the filtered data an organization extracts in the intermediate layer between the devices and the cloud (the so-called “fog”).

Obviously, this data holds powerful potential for boosting analytics efforts, and analyzing the volumes the Internet of Things will create requires new, advanced analytic techniques. The good news is that artificial intelligence and cognitive computing are maturing at a fast pace.

When used properly, analytics can help organizations translate IoT’s digital data into knowledge that contributes to new products, offerings, and business models. IoT can provide useful insights into the world outside company walls and help strategists and decision-makers understand their customers, products, and markets more clearly. And it can drive much more, including opportunities to integrate and automate business processes in ways never imagined before.

Rowan Trollope, Senior Vice President and General Manager of Cisco’s Internet of Things (IoT) and Applications, told participants at the Cisco Live conference, “One of the biggest mistakes you could make now is to underestimate the Internet of Things. This is a life or death issue for most of our customers. They have seen what has happened with Uber and taxi companies and with Netflix and Blockbuster”.

The bottom line is that IoT and Big Data can either disrupt your business or help you become more competitive compared to other businesses that are about to be disrupted.

 


Alexandra Presniatsova

Business Development Manager

E-mail: Alex.Presniatsova@altabel.com
Skype: alex.presniatsova
LI Profile: Alexandra Presniatsova

 


Altabel Group

Professional Software Development

E-mail: contact@altabel.com
www.altabel.com

The demand for healthcare services is growing at a rapid pace due to the constantly increasing number of people with chronic diseases. These days approximately one in two individuals has at least one chronic disease, and one in four has two or more chronic conditions. At the same time, there is more medical information available today about different diseases and their treatment options than ever before.
 

According to IBM, healthcare data doubles every two years. It has also been calculated that doctors would have to read 29 hours each workday to keep up with new professional insights. Obviously, while dealing with this huge information flow, doctors don’t have the capacity to judge how appropriate an option might be for a specific patient.

Additionally, the most expensive part of healthcare is human resources, which adds to the supply-and-demand issues. I guess no one will dispute the fact that professional healthcare is costly.

These insights raise several questions. How can we benefit from the explosion of information in the healthcare industry? Is it possible to cut costs for people who seek treatment without sacrificing the quality of such services, or even while improving it? How do we find a balance, after all?

The answer lies in two words: cognitive computing. A cognitive system can handle massive amounts of unstructured data to enable a new class of data interpretation and learning systems. Cognitive systems process information by comparing it to a teaching set of data, so the more data such a system analyzes, the more it learns, and the more accurate it becomes over time. To mimic the way the human brain works, cognitive systems use data mining, pattern recognition and natural language processing.
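The “teaching set” idea is, at its core, supervised machine learning. Here is a toy sketch using scikit-learn; the symptom snippets and labels are invented, and a real clinical system would train on vastly larger corpora:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

# Invented teaching set: labeled examples the system learns from.
teaching_texts = [
    "persistent cough, weight loss, chest pain",
    "fatigue, frequent urination, increased thirst",
    "chest pain, shortness of breath on exertion",
    "increased thirst, blurred vision, slow healing",
]
teaching_labels = ["pulmonary", "metabolic", "pulmonary", "metabolic"]

vectorizer = TfidfVectorizer()
features = vectorizer.fit_transform(teaching_texts)
model = MultinomialNB().fit(features, teaching_labels)

# The larger the teaching set, the more accurate the comparison becomes.
new_case = vectorizer.transform(["shortness of breath and chest pain"])
print(model.predict(new_case))  # ['pulmonary']
```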

The main advantage of these machine-learning systems is their ability to find patterns in datasets too large and complex for human brains to grasp. For doctors this means invaluable assistance in keeping track of records and making accurate clinical decisions. IDC predicts that by 2018 some 30 percent of healthcare systems will be running cognitive analytics against patient data and real-world evidence to personalize treatment regimens. What’s more, IDC projects that in the same year physicians will tap cognitive solutions for nearly half of cancer patients and, as a result, will reduce costs and mortality rates by 10 percent.

For patients, the ability of cognitive computing to act as an advisor and give a second opinion adds an extra level of assurance to the service provided by the healthcare sector. Eventually patients will have more confidence in the service they are receiving. Besides, bringing cognitive computing into healthcare makes remote check-ups available, including in areas with relatively little healthcare provision. It is predicted that in the U.S., for example, 40% of primary care encounters will soon be delivered virtually, which will be possible thanks to cognitive systems.

Summing up, cognitive computing can help:

  • Healthcare specialists manage all the available data and draw more precise conclusions about patients’ conditions
  • Patients get advice and answers to the questions they have
  • Decrease the costs of healthcare services

As data becomes more complex and diversified, cognitive computing will have an incredible impact on the healthcare industry.

In conclusion, let me give you a single real-life example. Watson (the famous IBM cognitive system used to diagnose patients) was able to identify a rare form of leukemia in an elderly woman whose illness had puzzled oncologists at the University of Tokyo for about a year. After analyzing 20 million research papers, Watson came up with the proper diagnosis. It took the system no more than ten minutes. Impressive, isn’t it?

 


Alexandra Presniatsova

Business Development Manager

E-mail: Alex.Presniatsova@altabel.com
Skype: alex.presniatsova
LI Profile: Alexandra Presniatsova

 


Altabel Group

Professional Software Development

E-mail: contact@altabel.com
www.altabel.com

If the experts’ estimates regarding IoT are correct, in 5-10 years there will be more than 50 billion interconnected devices in the world. And they will all generate zettabytes of data, which can and should be collected, organized and used for various purposes. The tight correlation between IoT and Big Data is hard to ignore: they are like Romeo and Juliet, made for each other. The unprecedented amount of data produced by IoT would be useless without the analytic power of Big Data. Conversely, without IoT, Big Data would not have the raw material from which to model the solutions expected of it.

What are the impacts of IoT on Big Data?

The IoT revolution means that almost every device or facility will have its own IP address and will be interconnected. Together they will generate a huge amount of data, streaming at us from all sides: household appliances, power stations, automobiles, train tracks, shipping containers and so on. That’s why companies will have to update their technologies, instruments and business processes to cope with such volumes of data, benefit from analyzing it and finally profit from it. The influence of IoT on Big Data is obvious and manifests itself in various ways. Let’s take a closer look at the Big Data areas impacted by IoT.

Data storage methods and facilities

IoT produces a large and steady flow of data that strains companies’ data storage. In response, many companies are shifting from their own storage frameworks toward the Platform as a Service (PaaS) model, a cloud-based solution that provides scalability, flexibility, compliance and an advanced architecture capable of storing useful IoT data.

There are three model options in modern cloud storage: public, private and hybrid. Depending on the nature of the data, companies should be careful when choosing a particular model. For instance, a private model suits companies that work with extremely sensitive data or with information governed by legislation. In other cases, a public or hybrid option will be a good fit.

Changes in Big Data technologies

While collecting the relevant data, companies need to filter out the excess and protect what remains from attack. This presupposes a highly productive mechanism comprising particular software and custom protocols. Message Queue Telemetry Transport (MQTT) and Data Distribution Service (DDS) are two of the most widely used protocols. Both can help thousands of sensor-equipped devices connect to real-time machine-to-machine networks. MQTT gathers data from numerous devices and pushes it through the IT infrastructure; DDS, by contrast, distributes data across devices.
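To give a feel for the MQTT side, here is a minimal publisher using the Eclipse Paho client (paho-mqtt 1.x API); the broker address and topic are placeholders:

```python
import json
import paho.mqtt.client as mqtt  # pip install paho-mqtt (1.x API shown)

BROKER = "broker.example.com"        # placeholder broker address
TOPIC = "plant/line1/temperature"    # placeholder topic

client = mqtt.Client()
client.connect(BROKER, 1883, keepalive=60)

reading = {"device_id": "sensor-042", "temp_c": 21.7}
# QoS 1: the broker acknowledges receipt, a common choice for telemetry.
client.publish(TOPIC, json.dumps(reading), qos=1)
client.disconnect()
```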

After receiving the data, the next step is to process and store it. The majority of companies tend to install Hadoop and Hive for Big Data storage, but some prefer NoSQL document databases such as Apache CouchDB, which is well suited here because it provides high throughput and very low latency.
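CouchDB speaks plain HTTP/JSON, so storing a reading needs nothing beyond an ordinary HTTP client. A minimal sketch; the server URL and database name are placeholders, and a secured server would also require credentials:

```python
import requests

COUCH = "http://localhost:5984"  # placeholder CouchDB server
DB = "sensor_readings"           # placeholder database name

# Create the database if it does not exist (412 means it already does).
requests.put(f"{COUCH}/{DB}")

# POST a JSON document; CouchDB assigns the document id.
doc = {"device_id": "sensor-042", "temp_c": 21.7, "ts": "2017-05-12T08:30:00Z"}
resp = requests.post(f"{COUCH}/{DB}", json=doc)
print(resp.json())  # {'ok': True, 'id': '...', 'rev': '...'}
```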

Filtering out redundant data

One of the main challenges of the Internet of Things is data management. Not all IoT data is relevant. If you don’t identify what data should be transmitted promptly, how long it should be stored, and what should be eliminated, you can end up with a bulky pile of data that still has to be analyzed. As Mobeen Khan, Executive Director of Product Marketing Management at AT&T, says: “Some data just needs to be read and thrown away.”

A survey carried out by ParStream (an analytics platform for IoT) shows that almost 96% of companies are striving to filter out excessive data from their devices, yet only a few of them manage to do it efficiently. Why is that? Below are the main problems companies face in the data analysis process; each figure shows the percentage of ParStream survey respondents confronting that challenge.

• Data collection difficulties – 36%
• Data is not captured accurately – 25%
• Slowness of data capture – 19%
• Too much data to analyze properly – 44%
• Data analysis and processing capabilities are not developed enough – 50%
• Existing business processes are not adjustable to allow efficient collection – 24%

To filter data effectively, organizations will need to upgrade their analysis capabilities and make their IoT data collection more productive. Data cleaning is a procedure that will become more important to companies than ever.
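“Read and thrown away” can be as simple as a deadband filter: forward a reading only when it differs meaningfully from the last one kept. A minimal sketch, with an arbitrary example threshold:

```python
# Deadband filter: keep a reading only if it moved more than `threshold`
# away from the last value we kept; everything else is thrown away.
def deadband(readings, threshold=0.5):
    kept, last = [], None
    for value in readings:
        if last is None or abs(value - last) > threshold:
            kept.append(value)
            last = value
    return kept

stream = [21.0, 21.1, 21.0, 21.2, 23.5, 23.6, 21.0]
print(deadband(stream))  # [21.0, 23.5, 21.0] – over half the stream discarded
```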

Data security challenges

The IoT has made an impact on the security field and posed challenges that can’t be resolved by traditional security systems. Protecting the Big Data generated by IoT is complicated because it comes from a multitude of devices producing different types of data over different protocols.

An equally important issue is that many security specialists lack experience in providing data security for IoT. In particular, an attack can threaten not only the data but also the connected device itself. Hence the dilemma: a huge amount of sensitive information is produced without the pertinent security to protect it.

There are two things that can help prevent attacks: a multilayered security system and thorough segmentation of the network. Companies should use software-defined networking (SDN) technologies combined with network identity and access policies to create dynamic network segmentation. SDN-based segmentation should also be used for point-to-point and point-to-multipoint encryption, based on combining software-defined networking with public key infrastructure (SDN/PKI). That way, data security mechanisms will keep pace with the growth of Big Data in IoT.

IoT requires Big Data

As IoT emerges, step by step, many questions arise: Where will the data coming from IoT be stored? How will it be sorted? Where will the analysis be conducted? Obviously, the companies that cope with these issues over the next few years will be in a prime position for both profit and influence over the evolution of our connected world. Devices will become smarter, able to handle larger amounts of data and probably to carry out limited analytics. However, as IoT grows and companies grow with it, they will have many more challenges to resolve.

What do you think about the evolution of Big Data in IoT? Have you already experienced its challenges? Do you have any ideas about progressive solutions to them? I’ll be happy to hear your opinion in the comments below; please feel free to share your thoughts.

 


Anastasiya Zakharchuk

Business Development Manager

E-mail: anastasiya.presnetsova@altabel.com
Skype: azakharchuk1
LI Profile: Anastasiya Zakharchuk

 


Altabel Group

Professional Software Development

E-mail: contact@altabel.com
www.altabel.com


When a technical term is used more and more frequently, its exact definition becomes “blurred” and its true meaning is often greatly distorted.

This is what happened to the term ‘business intelligence’, or BI. Since the term first appeared, the development of technology has substantially expanded our understanding of BI and of the advantage and benefit a company can derive from its available data.

So, what does ‘business intelligence’ mean today? How can it be useful for companies, and how should its underlying ideas be applied to ensure steady growth in the efficiency and profitability of a business?

What is business intelligence? Why is it important?

BI consists of two quite different, but at the same time complementary, aspects.

  1. Value for the business.

    This covers how companies can use the available information to multiply profit and efficiency and to bring new products and services to market successfully.

  2. IT strategy.

    This covers which technological solutions to apply in order to get the greatest possible utility out of BI.

Presenting data in a format a company can use efficiently has always been a challenging task. For many organizations, it is quite complex to determine what particular information is required for a specific use.

Such business analysis requires certainty in methodologies and goals.

Earlier, BI’s possibilities were limited by the lack of data collection technologies. Modern technologies such as big data, analytics, mobile services and cloud computing, taken together, now make it possible to obtain a continuous flow of detailed information quite quickly and without serious investment.

Still, the real challenge lies in extracting valuable meaning from this data, and in many respects that is much more complicated than collecting the information itself.

Five efficiency criteria for a BI system (and a BI strategy)

1. Select a BI system guided by the real needs of your particular company

The most common and at the same time most dangerous mistake is letting the BI system dictate the strategy of its usage. As a result, the company ends up with plenty of non-synchronized applications, awkward interfaces, and an infrastructure that is already out of date yet so entrenched in the IT landscape that it can barely be replaced.

2. Be flexible

A flexible model for integrating the appropriate software involves constant iteration, with the system developing gradually. This allows companies to evaluate the success of the project at any point in time, to determine what stage it is at and where it is heading.

As a rule, creating, testing and integrating BI technologies go much more smoothly when the company receives real-time feedback from all the running processes and is able to make the required adjustments on the fly. This is vital for BI systems!

3. User-friendly interface

BI solutions focus on the collection, visualization and management of data.
Usually, when it comes to large amounts of numeric information, companies run the risk of ending up with data that is overly technical, inconvenient and incomprehensible to non-expert users of the system. Such information may be highly functional, but it is impractical, especially when it is badly integrated with other applications.

Integration is a key point in deploying BI technologies. If the interface is non-intuitive, complex and inconvenient for end users, the BI system will certainly work inefficiently.

There is a tendency to allocate significant resources to integrating the latest technologies that promise unprecedented results. However, such investments can potentially do more harm than good. Intelligent, targeted and smooth integration is the key to avoiding serious errors during implementation.

4. BI is a tool available to everyone

BI has long been used by all kinds of users, not only by experts with the appropriate education and experience. A BI system should be simple and easy for everyone to understand.

For this purpose, companies have to make analytics, and the reports drawn from it, convenient: simple and demonstrative. The collected data should be presented in such a way that any user can easily draw definite conclusions.

5. Centralize your data

Achieving results based on useful information implies proper data handling. Receiving data from multiple sources and storing it in a centralized database capable of filtering, sorting and removing the unnecessary is critical for deploying the applications involved in making business decisions. Apart from that, risk management also becomes more effective through transparency and structure.
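A minimal sketch of that idea using pandas; the file names and columns are invented for illustration:

```python
import pandas as pd

# Pull records from two invented sources into one central table.
crm = pd.read_csv("crm_export.csv")        # customer_id, revenue, date
billing = pd.read_csv("billing_dump.csv")  # same columns, another system

central = pd.concat([crm, billing], ignore_index=True)
central = central.drop_duplicates(subset=["customer_id", "date"])
central = central.sort_values("date")
central = central[central["revenue"] > 0]  # remove the unnecessary

# One centralized, filtered, sorted store for decision-making applications.
central.to_csv("central_store.csv", index=False)
```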

General excitement over BI is evident

The role that IT plays in the world has changed significantly over the past few years thanks to the information ‘boom’. Still, building a technological infrastructure is not enough for successful data management.

That is why ‘business intelligence’ is not just a fashionable term: it is a concept that demonstrates the need to move beyond the paradigm in which data analysis and business goals exist separately, in isolation from each other.

In fact, BI reminds us that technology and business must be closely linked, so that business goals and guidelines predetermine the choice of software, and the software in return provides useful information leading the business to success.

 

Tatyana Ogneva

Business Development Manager

 


Altabel Group

Professional Software Development

E-mail: contact@altabel.com
www.altabel.com


With all the talk about rapid deployment and breakneck business change, there can be a tendency to assume that businesses are up and running with new technologies as soon as these technologies emerge from proof of concept and enter a mature and commercialized state. However, the realities of where companies are don’t always reflect this.

Take virtualization. It has been on the scene for over a decade, yet recent research by 451 Research shows that only 51 percent of servers in enterprise data centers around the world are virtualized. Other recent survey data collected by DataCore shows that 80 percent of companies are not using cloud storage, although cloud concepts have also been with us for a number of years.

This situation is no different for big data, as reflected in a Big Data Work Study conducted by IBM’s Institute of Business Value. The study revealed that while 33 percent of large enterprises and 28 percent of mid-sized businesses have big data pilot projects under way, 49 percent of large enterprises and 48 percent of mid-sized businesses are still in big data planning stages, and another 18 percent of large enterprises and 38 percent of mid-sized businesses haven’t yet started big data initiatives.

The good news is that the study also showed that of those organizations actively using big data analytics in their businesses, 63 percent said that the use of information and analytics, including big data, is creating a competitive advantage for their organization, up from 37 percent just two years earlier.

The stumbling block for many, and the reason organizations stall in the planning and pre-planning stages, appears to be confusion about how best to make big data work for the company and pay off competitively.

Big data projects need to demonstrate value quickly and be tightly linked to bottom line concerns of the business if big data is to cement itself as a long-term business strategy.

In far too many cases, people plan to build out a complete system and architecture before using a single insight or building even one predictive model to accelerate revenue growth. Everyone anticipates the day when Big Data becomes a factory spitting out models that finally divulge all manner of secrets, insights, and profits.

So how do you jump start your big data efforts?

Find big data champions in the end business, along with business cases that are tightly constructed and offer opportunities where analytics can quickly be put to use.

When Yarra Trams of Melbourne, Australia, wanted to reduce the amount of time spent repairing tram tracks in the field, it placed Internet sensors along the physical track and fed the signals from these devices into an analytics program that could assess which areas of track had the most wear and would likely need repair soon. The program reduced mean time to repair (MTTR) for service crews because it preempted problems before they occurred: worn track could now be repaired or replaced before it ever became a problem, resulting in better service (and higher satisfaction) for passengers.
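The analytics behind such a program can be as simple as trend extrapolation: fit each track segment’s wear readings over time and flag the segments that will cross the repair threshold soon. A sketch with invented numbers; this is not Yarra Trams’ actual system:

```python
import numpy as np

WEAR_LIMIT = 8.0    # mm of wear at which a segment must be repaired
HORIZON_DAYS = 90   # "soon" = within the next 90 days

# Invented wear readings: (days since first measurement, wear in mm).
segments = {
    "segment-A": ([0, 30, 60, 90], [5.0, 5.6, 6.3, 6.9]),  # wearing fast
    "segment-B": ([0, 30, 60, 90], [2.0, 2.1, 2.2, 2.3]),  # wearing slowly
}

for name, (days, wear) in segments.items():
    slope, intercept = np.polyfit(days, wear, 1)  # linear wear trend
    days_to_limit = (WEAR_LIMIT - intercept) / slope - days[-1]
    if days_to_limit <= HORIZON_DAYS:
        print(f"{name}: schedule repair (~{days_to_limit:.0f} days to limit)")
```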

Define big data use cases that can either build revenue or contribute to the bottom line.

Santam, the largest short-term insurance provider in South Africa, used big data and advanced analytics to collect data about incoming claims, automatically assessing each one against different factors to identify patterns of fraud, saving millions in fraudulent insurance payments.
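That description maps onto a simple pattern: score each incoming claim against a set of risk factors and route high scores to investigators. A sketch with invented factors, weights and threshold; a real system would learn these from historical claims:

```python
# Each factor: (name, test on the claim, weight added if the test fires).
RISK_FACTORS = [
    ("new_policy",   lambda c: c["policy_age_days"] < 30, 3),
    ("big_claim",    lambda c: c["amount"] > 50_000, 2),
    ("prior_claims", lambda c: c["claims_last_year"] >= 2, 2),
]
THRESHOLD = 4  # scores at or above this go to a human investigator

def score_claim(claim):
    return sum(weight for _, test, weight in RISK_FACTORS if test(claim))

claim = {"policy_age_days": 12, "amount": 80_000, "claims_last_year": 0}
score = score_claim(claim)
print(score, "-> investigate" if score >= THRESHOLD else "-> fast-track")
# 5 -> investigate
```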

Focus on customers

There is already a body of mature big data applications surrounding the online customer experience. Companies (especially in retail) can take advantage of this if they team up with a strong systems integrator or a big data product vendor with experience in this area.

Walmart and Amazon analyze customer buying and Web browsing patterns for help in predicting sales volumes, managing inventory and determining pricing.

 

Kristina Kozlova

Marketing Manager

 


Altabel Group

Professional Software Development

E-mail: contact@altabel.com
www.altabel.com

