Major problems for IoT big-data analytics in a cloud environment.

A. IoT knowledge discovery or data analytics problems (Value)

• Managing heterogeneous knowledge.

• Transforming the data into knowledge.

• Transforming the knowledge into actions.

• Transforming the actions into cognitive-decisions.

• Tuning IoT big-data to regulate numerous autonomous applications, such as industrial automation.
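The data-to-knowledge-to-action-to-decision chain listed above can be sketched as a minimal pipeline. The sensor readings, the 30-degree threshold, and the action names below are illustrative assumptions, not part of any specific IoT platform:

```python
# Minimal sketch of the IoT value chain: data -> knowledge -> action -> decision.
# Sensor fields, thresholds, and action names are illustrative assumptions.

def to_knowledge(readings):
    """Transform raw, heterogeneous data into knowledge (here: a summary)."""
    temps = [r["value"] for r in readings if r["type"] == "temperature"]
    return {"avg_temp": sum(temps) / len(temps)} if temps else {}

def to_action(knowledge, threshold=30.0):
    """Transform knowledge into an action for an actuator."""
    return "start_cooling" if knowledge.get("avg_temp", 0.0) > threshold else "idle"

def to_decision(action, history):
    """Transform actions into a cognitive decision using past context."""
    history.append(action)
    # Escalate if cooling has been requested three times in a row.
    if history[-3:] == ["start_cooling"] * 3:
        return "raise_alarm"
    return action

readings = [{"type": "temperature", "value": v} for v in (31.0, 32.5, 33.1)]
history = []
print(to_decision(to_action(to_knowledge(readings)), history))
```

The point of the sketch is the separation of stages: each transformation consumes the previous stage's compact output, never the raw stream.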


Networks of trillions of IoT objects constitute a large-scale IoT environment, from which huge volumes of structured, semi-structured, and unstructured IoT big-data are produced in real time. Data analytics therefore plays a major role in an IoT database management system.

The discussion begins with three major queries:

• Can humans create storms?

• Can a smartphone understand humans?

• How can IoT be made smarter?

Three major issues arise in a broadcast storm: contention, collision, and redundancy. 4G and 5G radio networks support cloud and IoT virtual-reality interactions, respectively. A trend was discussed that migrates from the compute world to the network (cyber) world, and now to the cyber-physical system (CPS) world. The CPS encompasses IoT, M2M, big-data, and 4G/5G communication systems.

Making apps smarter is an important activity on an IoT platform. It includes four broad classes of activities:

• Apps that can understand humans - e.g. recognize human activity, track and manage health.

• Apps that provide Assistance- e.g. to simplify and to log daily behaviors.

• Apps that can recommend- e.g. to suggest dressing, games, and plan trips.

• Apps that help to visualize - e.g. Google Maps-related apps for location finding.

Based on the contextual information provided through data in clouds, apps can be made smarter and, for some tasks, even more intelligent than a human being.

The discussion also covered some visualizations:

• Wearable devices with higher stickiness, reminder, and auto-warning system.

• Wearable localization with particle distributions, OCR angulations, building a math model, and finally re-shaping the particle sets to improve the navigation system.

• The learning context, that includes collaborative sensing, crowd sensing, instant encountering, and information convergence.
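The particle-based localization step mentioned above (build a distribution over position, weight it against a measurement model, then re-shape the particle set) can be sketched as one particle-filter resampling pass. The 1-D positions, the single measurement, and the Gaussian noise model are illustrative assumptions; a real wearable system would use 2-D/3-D motion and measurement models:

```python
import math
import random

# Minimal 1-D particle-filter step for localization (illustrative only).

def weight(particles, measurement, sigma=1.0):
    """Weight each particle by how well it explains the measurement."""
    w = [math.exp(-((p - measurement) ** 2) / (2 * sigma ** 2)) for p in particles]
    total = sum(w)
    return [x / total for x in w]

def resample(particles, weights, rng):
    """Re-shape the particle set: redraw particles in proportion to weight."""
    return rng.choices(particles, weights=weights, k=len(particles))

rng = random.Random(42)
particles = [rng.uniform(0.0, 10.0) for _ in range(200)]  # prior over position
weights = weight(particles, measurement=4.0)              # sensor says ~4.0
particles = resample(particles, weights, rng)
estimate = sum(particles) / len(particles)
print(round(estimate, 2))  # estimate concentrates near the measurement
```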


Overall, situated computing mainly encompasses the following:

- Put in a place and condition

- Learn context

- Explicit M2M and collaborations

- Fast develop-able

- Massive connectivity

The progression toward smarter IoT apps also requires 5G mobile networks, alongside the rapid growth of billions of device connections. Three main components of IoT - big-data, cloud, and software-defined networking - must work in an integrated way to make IoT smarter.


In an IoT data transformation and inference framework, four activities must be synchronized within strict timelines for any real-time safety-critical application:

• Large-scale data collection and normalization from data sources.

• Time-critical knowledge transformations and inference.

• Configuring knowledge dissemination strategy.

• Strategic cognitive decisions and actuation in a real-time basis.
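The four synchronized activities above can be sketched as a single deadline-checked loop. The stage implementations, the 100 ms budget, and the cooling threshold are all illustrative assumptions:

```python
import time

# Sketch of a time-critical IoT pipeline: one cycle runs all four stages and
# is flagged if it misses the deadline. The 100 ms budget is hypothetical.
DEADLINE_S = 0.1  # assumed per-cycle budget for a safety-critical loop

def collect():       return [21.0, 22.5, 24.0]                    # 1. collection/normalization
def infer(data):     return max(data)                             # 2. knowledge inference
def disseminate(k):  return {"topic": "alerts", "knowledge": k}   # 3. dissemination strategy
def actuate(msg):    return "cool" if msg["knowledge"] > 23 else "idle"  # 4. decision/actuation

def run_cycle():
    start = time.monotonic()
    action = actuate(disseminate(infer(collect())))
    elapsed = time.monotonic() - start
    return action, elapsed <= DEADLINE_S  # (action, met_deadline)

action, met_deadline = run_cycle()
print(action, met_deadline)
```

Binding the stages into one measured cycle is what lets the framework detect, per cycle, whether the real-time requirement was actually met.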


So, how can all these activities be bound into a common framework so as to meet real-time requirements?

The response time should be as small as possible in any real-time system design.

Problem motivation (in-network knowledge inference framework):

• Minimize data flow to the external network, thus reducing traffic congestion.

• Solve sensors' energy problems.

In real-time risk-driven applications, where actuators decide their actions based on collected sensing data, improper knowledge collection may lead to damage to society and to property.

In safety-critical applications, actuators cannot derive proper instructions from large volumes of raw data, so the data need to be transformed into knowledge in order to generate appropriate instructions for actuators to perform intelligent actions.

Processing consumes far less energy than communication, so it pays to process more data locally in order to minimize the large communication energy; hence in-network data processing is encouraged.

So, can we use an in-network IoT framework for effective knowledge accumulation?

Research suggests that out of 300,000 bytes of raw data, the actual filtered knowledge may be only 8,000 bytes, so why spend extra communication energy transmitting the raw data?
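Under a simple first-order model where radio energy scales linearly with bytes sent, the 300,000-byte vs. 8,000-byte figure above implies roughly a 37.5x reduction in communication cost. The per-byte energy constant below is a made-up placeholder, not a measured radio parameter:

```python
# First-order estimate of communication savings from in-network filtering.
# ENERGY_PER_BYTE_UJ is an assumed placeholder, not a real radio constant.
RAW_BYTES = 300_000
KNOWLEDGE_BYTES = 8_000
ENERGY_PER_BYTE_UJ = 1.2  # microjoules per byte (hypothetical radio)

raw_energy = RAW_BYTES * ENERGY_PER_BYTE_UJ          # energy to send raw data
filtered_energy = KNOWLEDGE_BYTES * ENERGY_PER_BYTE_UJ  # energy to send knowledge
reduction = RAW_BYTES / KNOWLEDGE_BYTES

print(f"reduction factor: {reduction:.1f}x")
print(f"energy saved: {(raw_energy - filtered_energy) / 1e6:.3f} J")
```

Whatever the true per-byte cost, it cancels out of the ratio: the 37.5x factor depends only on the raw-to-knowledge size ratio.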

In-network data processing is more desirable than external processing due to the strict time deadlines of safety-critical applications.

So an in-network knowledge inference framework is required to solve these problems.
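A minimal shape for such an in-network framework is a per-node filter that reduces a window of raw samples to one compact knowledge record and forwards only that record. The summary fields and the alert threshold below are assumptions for illustration:

```python
# Sketch of in-network knowledge inference: a node summarizes raw samples
# locally and forwards only the small knowledge record (illustrative format).

def node_inference(samples, alert_threshold=50.0):
    """Reduce a window of raw samples to one compact knowledge record."""
    record = {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / len(samples),
    }
    record["alert"] = record["max"] > alert_threshold  # decision-ready flag
    return record

raw_window = [42.0, 47.5, 51.2, 44.1]  # raw samples never leave the network
knowledge = node_inference(raw_window)
print(knowledge["alert"], knowledge["count"])
```

Only the few bytes of the record cross the external network, which is exactly the traffic and energy saving the motivation above asks for.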
