Blog posts tagged in IoT

Posted by on in Thoughts

Gartner’s 10 strategic predictions for 2017 and beyond makes me delve, almost involuntarily, into imagining what the future holds.

As John leaves work and heads to the building lobby, his car is already waiting for him; self-driving cars are almost mainstream. He simply tells his car, “Drive me home”. He arrives home, which is already cooled or heated to his preference, and picks up the freshly brewed pot of coffee to pour himself a cup. As he walks into the living room, he says “Play HBO” and the TV turns on with the HBO channel playing. Deeply engrossed in the movie, John is suddenly interrupted by his virtual assistant (an Amazon Echo) reminding him about a dinner party scheduled for later in the evening. He tells the assistant to buy some flowers and a good bottle of wine. Using virtual reality, he is immediately present in a virtual mall and able to hand-pick these items. As he completes a virtual checkout, the selected items are dispatched by drone to his home within the half hour, and John is all set for the party.

In some time technology will make all of this a reality; some of it already is. Let us now look at the technology underlying it. At the fundamental level we have the Internet of Everything: all devices are connected to the grid all the time. This allowed John’s car to estimate his arrival time and share it with devices at home, which in turn allowed his air conditioner to set the appropriate temperature and his coffee maker to brew his preferred coffee beforehand. Almost all the interactions are voice-based rather than clicks on a screen. Devices with audio input will be trained to activate only on a specific person’s voice (biometric, audio-based authentication is implicit). Even the act of purchasing something no longer happens in a mobile application: most shopping will happen through a virtual reality channel, and the experience will be most gratifying. No more running to the local store for last-minute errands; deliveries happen by drone in the most efficient manner possible.

Virtual stores of the future will have neither physical stores nor warehouses; instead they will rely on just-in-time (JIT) inventory from the suppliers directly. Goods will be shipped from the supplier straight to consumers based on orders received by the virtual stores. The virtual store will completely change the shopping experience for its consumers using virtual reality, allowing them to touch and feel objects prior to purchasing them. Credit transactions will happen transparently in the background based on biometric approval from the consumer: the virtual reality goggles will perform an iris scan to authenticate the consumer, digitally sign the transaction, and approve it. Blockchain will be used by merchants to maintain these financial transactions in an authentic, non-repudiable fashion.

All devices in the home will be connected and will share analytics metrics with their manufacturers. For example, the air-conditioning/heating unit will share detailed metrics on compressor performance, power consumption trends, etc. with its manufacturer. The manufacturer can leverage this data to perform analytics and predict outages and faults well in advance, which in turn ensures that a service technician (possibly a robot) makes a home visit before the device breaks down. Preventive maintenance will help continuity and prevent outages. Consumers and businesses alike will benefit tremendously from this.
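As a toy illustration of the kind of analytics involved, a manufacturer might flag sensor readings that drift far from their recent trend. The sketch below is a minimal, generic anomaly detector; the window size and z-score threshold are illustrative assumptions, not any vendor's actual algorithm:

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=20, z_threshold=3.0):
    """Flag readings that deviate strongly from their trailing window."""
    flags = []
    for i, x in enumerate(readings):
        history = readings[max(0, i - window):i]
        if len(history) >= 5:
            mu, sigma = mean(history), stdev(history)
            # A large z-score suggests the unit is behaving abnormally
            flags.append(sigma > 0 and abs(x - mu) / sigma > z_threshold)
        else:
            flags.append(False)  # not enough history yet
    return flags
```

A real deployment would feed such flags into a scheduling system so that a technician visits before the fault actually occurs.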

The overall lifestyle and experience will change dramatically. People will wear fitness bands/trackers and share the data with their healthcare provider as well as their health insurance company. This will enable the healthcare provider to proactively track an individual’s health (again through analytics) and detect issues before they arise. Insurance companies will also base premiums on an individual’s health and lifestyle patterns, the latter including diet and food habits (from your virtual store grocery shopping), exercise regime (fitness tracker), etc.

With everything integrated, security is key. With IoT devices, it is imperative that security is baked in at multiple levels.

Let us look at these levels in more detail below:


Device security – The device needs to protect itself from attackers and hackers. This includes (but is not limited to) the following: hardening the device at the OS level, securing confidential information on the device (data at rest on the device), firewalling the device, etc.


Authentication – Each entity (device, cloud service, edge node/gateway, etc.) needs to authenticate itself to the corresponding entity. If there are default username/passwords on the device, then it needs to enforce a password reset on initial power-on (along with a factory reset option). Ideally the device should not use a static password for authentication. In our earlier post on OTP-based device authentication for improved security we discussed a novel approach which helps address the challenges faced by IoT device manufacturers today.



Network communication channel security – Today there are various communication channels at play, for example – devices communicating with their respective cloud service providers, devices communicating with fog/edge computing services/devices, devices interacting with other devices, etc. It is important that each communication channel is secured and there exists trust between the communicating endpoints. The channel can be secured using TLS as appropriate.
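As a minimal sketch of what "secured using TLS" can mean on the device side, the snippet below builds a client context that pins a CA and optionally presents a device certificate for mutual authentication. It assumes Python's standard ssl module rather than any particular IoT SDK, and the file-path parameters are placeholders:

```python
import ssl

def make_device_context(ca_file=None, cert_file=None, key_file=None):
    """Build a TLS client context suitable for a device talking to its cloud service."""
    # PROTOCOL_TLS_CLIENT enables hostname checking and certificate verification by default
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
    if ca_file:
        ctx.load_verify_locations(cafile=ca_file)  # pin the trusted CA
    if cert_file:
        # Presenting a client certificate lets the server authenticate the device too
        ctx.load_cert_chain(certfile=cert_file, keyfile=key_file)
    return ctx
```

The same context can then wrap the socket used by an MQTT or HTTP client, so every channel between the device and its endpoints is both encrypted and mutually trusted.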


Cloud service security – The cloud service provides the backbone for the services offered. Its attack surface needs to be minimal, and it must be hardened and firewalled against DDoS attacks. Data from the devices is collected at the cloud service end and needs to be secured (data at rest); depending on the nature of the data and the service provided, this data may need to be opaque even to the cloud service provider. The provider needs to ensure that appropriate backup and disaster recovery plans are in place, and also needs to present its business continuity plan to its subscribers. The Cloud Security Alliance (CSA) provides good guidance to cloud service providers.


Privacy – This relates more to data sharing across disparate service providers. With IoT, devices will end up communicating with devices and services from other providers. How much information can be shared across service providers, and with what user consent, needs to be carved out explicitly. Service providers will need to incentivize users to allow sharing information with other providers; the user needs to benefit from the sharing eventually to allow it.


To summarize, security is a key aspect of the success of IoT.


Tagged in: IoT security
Last modified on
Hits: 419
Rate this blog entry:

The recent massive distributed denial of service (DDoS) attack on 21st October 2016 affected numerous cloud service providers (Amazon, Twitter, GitHub, Netflix, etc.). It is interesting to note that this attack leveraged hundreds of thousands of internet-connected consumer devices (aka IoT devices) which were infected with malware called Mirai. Who would have suspected that the attackers involved were essentially consumer devices such as cameras and DVRs?

A Chinese electronics component manufacturer (Hangzhou Xiongmai Technology) admitted that its hacked products were behind the attack (reference: ComputerWorld). Our observation is that security vulnerabilities involving weak default passwords in the vendor’s products were partly to blame. These vulnerable devices were first infected with the Mirai malware, and the infected devices then launched an assault to disrupt access to popular websites by flooding Dyn, a DNS service provider, with an overwhelming amount of internet traffic. The Mirai botnet is capable of launching multiple types of DDoS attacks, including TCP SYN flooding, UDP flooding, DNS attacks, etc. Dyn mentioned in a statement, “we observed 10s of millions of discrete IP addresses associated with the Mirai botnet that were part of the attack” – such is the sheer volume achievable by leveraging the millions of existing IoT devices out there.

Subsequently, Xiongmai shared that it had already patched the flaws in its products in September 2015, ensuring that customers have to change the default username and password on first use. However, products running older versions of the firmware are still vulnerable.

This attack reveals several fundamental problems with IoT devices as things stand today:

  • Default username and passwords
  • Easily guessed, customer-chosen, easy-to-remember (read: “weak”) passwords
  • Challenges with over-the-air (OTA) updates etc.

The first two problems are age-old issues, and it is surprising to see them come up with newer technologies involving IoT devices as well. Vendors have still not moved away from the traditional technique of default usernames and passwords, nor have customers adopted strong passwords. It is probably time we simply accept that the latter will not happen and remove the onus on customers to set strong passwords (it is just not going to happen!).

One-time passwords (OTP) can be quite helpful here. A one-time password, as the name suggests, is a password that is valid for only one login session. It is a system-generated password which is essentially not vulnerable to replay attacks. There are two relevant standards for OTP – HOTP [HMAC-based One-Time Password] and TOTP [Time-based One-Time Password]. Both standards require a shared secret between the device and the authentication system along with a moving factor, which is either counter-based (HOTP) or time-based (TOTP).
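Both algorithms fit in a few lines of standard-library code. The sketch below follows RFC 4226 (HOTP) and RFC 6238 (TOTP) using HMAC-SHA1 and 6 digits, the common defaults:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HOTP (RFC 4226): HMAC over an 8-byte big-endian counter."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation picks a 4-byte window
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """TOTP (RFC 6238): HOTP with the counter derived from the current time."""
    return hotp(secret, unix_time // step, digits)
```

With the RFC test secret `b"12345678901234567890"`, `hotp(secret, 0)` yields `"755224"`, matching the test vectors published in RFC 4226, so an implementation on the device can be checked against one on the server.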

GS Lab’s OTP-based device authentication system presents a novel approach which helps address the challenges faced by IoT device manufacturers today. It provides an unstructured device registry which is flexible enough to include information on various types of devices, and an authentication sub-system which authenticates the IoT devices tracked in the device registry via OTP. The authentication sub-system is built on top of the existing OTP standards (HOTP and TOTP) and helps alleviate the need for static (presumably weak) passwords in IoT devices. It supports the MQTT and REST protocols, which are quite prevalent in the IoT space; support for additional protocols (like CoAP) is already planned and in the works. The OTP-based device authentication system is built on top of our open source OTP Manager library.

Here are some of the advantages of using GS Lab’s OTP-based device authentication system:

  • Strong passwords – system generated based on shared secret key
  • Not vulnerable to replay attacks – passwords are for one-time use only
  • Freedom from static user-defined passwords
  • Standards based solution – HOTP and TOTP standards
  • Relevant for resource-constrained devices – the crypto algorithms used by the HOTP and TOTP standards work on devices with limited CPU and memory
  • Ability to identify malicious devices – rogue devices can be identified using HOTP counter value
  • Provides device registry for simplified management




Posted by on in Technology

On a mundane February afternoon, as I headed for lunch, I remember getting a phone call from within my company, and with it an opportunity to participate in an IoT training program! Little did I know that the training sessions were to be online, live, and interactive, but early in the morning. I'm not a morning person and was a little hesitant, but somehow 'curious me' prevailed over 'hesitant me' and I subscribed. Having heard quite a bit about the Internet of Things (IoT), I wanted to get a taste of it, and this training program presented that opportunity. It was not only about learning, but also about getting our hands dirty to build something!

Right after the introductory session, it was clear that we could reap the benefits in a much better way if we participated as a team. So we formed a team of developers carrying experience in different areas such as UI, server side, native applications, hardware devices, etc. From then onwards, we embarked on a journey in a quest to learn what it means and takes to build an IoT project using an IoT platform. What follows is an account of our experiences.

Learning an IoT platform
This was as good as it could get. We got to learn an IoT platform, a domain language (TQL, that is), and ways to integrate with hardware devices, sensors, and actuators. There was a well-organized set of sessions which took us on a tour of the platform and how to use it. The course covered advanced features like clustering and macros, which made it even more 'pragmatic'.

Hands-on is the key, and you get to do plenty of it
One of the best parts of this program is that you get to work hands-on. In fact, you are all but forced to get your hands dirty. It's not without reason that the philosophy of 'learning by doing' exists! We played a lot with the Raspberry Pi, the Arduino Uno, sensors, actuators and, of course, the TQL system itself. This rendezvous did present us with its fair share of issues, but it was all worth it.

Technically enriching discussions
One of my reasons for subscribing to this training program was to hear about the IoT platform directly from its creators. That is a big deal! It was evident from the interactions we and the community had during as well as after the sessions: why a particular feature is implemented in a certain way, why certain things are restricted on the platform, and so on. This helped participants, especially developers and architects, learn what goes into the making of an IoT platform.

Vibrant support forum
When you open the Slack web app for the TQL team, you are greeted with a random but nice message. One of the Slack messages that struck a chord with me instantly was: “We're all in this together.” This message sums up the kind of support the Atomiton folks are committed to providing. Questions are answered in depth and in minute detail, with the reasoning explained as well as available alternatives or workarounds.

Mutually rewarding community
As the participants are required to build projects, they naturally get to showcase them to the community. This helps everyone understand how the platform can be put to use to solve real-life problems, how others in the community are using it in innovative and creative ways, and, in a much larger context, what IoT is all about.

When you are doing something over and above your regular work, you need high levels of commitment, and you also need a great deal of motivation! There was enough of it, at the right times, to keep us going, and it rightly came with tips and suggestions for improvement.

Improvement areas: what can be done to make this even better?

Developer is king!
The developer is king, and he needs to be pampered. ;) The more developer-friendly features in the TQL studio, the better. Hover-for-help messages, auto-completion, and templates at your fingertips (for queries, macros, JavaScript usage, inline comments) are some of our suggestions to enhance the TQL studio experience.

Auto-generation of basic queries from models
This will save some work for the developer and will also serve as a guide for writing custom/complex queries. I would go a step further and suggest auto-generation of code for the UI: to access data over WebSockets as well as over HTTP.

Highlight security aspects
Make this a must in the training program; let it be a differentiator. The following aspects are worth a thought:

    • Can hardware devices be given fingerprints (unique identities)?
    • If a web app is being served using the app-attachment feature, how can it be exposed over HTTPS?
    • How does one invoke an external service over HTTPS?
    • Security in the built-in protocol handlers

Hardware bottlenecks

One of the observations our team made after completing the final project was that working with 'things' is not the same as working with pure software! We then asked ourselves what would make working with 'things' easier, and realized that knowledge of setting the hardware up, and of integrating with it, would do just that. Our suggestion is to make this child's play; crowd-sourcing could well be utilized here. Making it easy and simple would let participants focus more on the project and on utilizing the TQL System's features in full glory. Items to focus on here:

Raspberry Pi network connectivity: mainly a list of FAQs, especially covering the many different ways to set it up.
Basic sensors and their connections with the Arduino Uno and/or Raspberry Pi.

Going a step further, it would be great to share notes comparing off-the-shelf hardware vs. specialized high-end hardware, e.g. Raspberry Pi vs. Libelium. Can a Raspberry Pi be used in a production environment?

Session prerequisites
It would help if the prerequisites were mentioned for each of the sessions, and the content for these prerequisites made available. For example, right from the first session the participants need an understanding of the Raspberry Pi and Arduino Uno. If they have already gone through that material, then the first session becomes a hello-world purely for the TQL system rather than a hello-world for all of the hardware devices plus the TQL system.


Tagged in: IoT TQL

Posted by on in Technology


Pre-Computer Era

This can be termed the ‘pen and paper’ era. It witnessed the building of the foundation. The concept of numbers became concrete; zero was invented by Brahmagupta or Aryabhata, depending on which way you look at it; the number systems evolved. The earliest known tool used in computation was the abacus, thought to have been invented around 2400 BC.


A number of devices based on mechanical principles were invented to help in computing, leading eventually to analog computers. Computational theory also evolved with the advent of logarithms and the like.

Computer Era

The concept of using digital electronics for computing, leading to modern computers, is recorded around 1931. Alan Turing modelled computation, leading to the well-known Turing Machine. The ENIAC was the first electronic general-purpose computer, announced to the public in 1946.


Since then, computers have come a long way. There are supercomputers; a variety of devices like mainframes, servers, desktops, laptops, and mobiles; and specialized networking hardware like gateways, routers, and switches. These enabled the culmination into the internet and the World Wide Web as we know them. There are storage arrays for all the storage-related capabilities, including snapshots, backups, and archival; there are Application-Specific Integrated Circuits (ASICs); and so on and so forth.

Software Defined Era

Soon enough this hardware started getting driven by software, and the software grew more and more sophisticated. It evolved over paradigms like multi-tier architecture, loosely coupled systems, off-host processing, etc. Then came the advent of virtualization.

A lot of concepts in computing could now be abstracted easily at various levels, which enabled many use cases. For example, routing logic moved to software, so networks could be reconfigured on the fly, enabling migration of servers and devices in response to user and application requirements. Tiered storage can be exposed as a single block store and a file system store at the same time, giving the capability of laying out the data efficiently in the backend without compromising the ease of managing it effectively from a variety of applications.

The cloud started making everything available everywhere, for everyone. Concepts like Software Defined Networking (SDN) and Software Defined Storage (SDS) are leading to Software Defined Everything (yes, some people have started coining such a term, and you will start seeing it widely soon enough). Hardware is getting commoditized, and specialized software addressing these needs is on the rise.

Beyond Software

It is still not clear what will replace software, but some trends and key players have already started to emerge in this direction. A number of components, such as open source projects, are readily available as building blocks; one might have to just put them together to solve a variety of problems without writing much code. Computing has moved away from “computing devices” into general-purpose common devices like watches, clothing, cars, speakers, even toasters; every device is becoming intelligent. The hardware ecosystem is more or less commoditized already, and software is on the same path; witness the proliferation of OpenStack or of IoT platforms, for example. One might have to simply configure them to address one's needs: OpenStack Cinder can be configured to clone volumes for creating test/dev environments efficiently, and IoT can make a production plant efficient in real time through continuous monitoring, reconfiguration, and management of its resources. It could be Docker containers that one only has to deploy, plug-and-play, to have complete solutions in place. Handwriting recognition and voice-commanded devices can turn a mere thought into a complete working solution! Machine learning can provide already fully functional machines like smart cars.

Who knows, a day might come when, without doing anything, everything will be achieved, out of thin air so to speak! At this time it might sound like a wild stretch of the imagination, but just quickly reflect on the evolution of computing so far. It might take a really long time to get there. In fact, by then there may be no need for anyone to write posts like this one: a few Google searches, looking around with open eyes, and taking it all in with the senses may be enough for everyone to have grasped the gist of the message!
