The concept of a network of smart devices was
discussed as early as 1982, with a modified Coke machine at Carnegie
Mellon University becoming the first Internet-connected appliance, able
to report its inventory and whether newly loaded drinks were cold. Mark
Weiser’s 1991 paper on ubiquitous computing, “The Computer for the 21st
Century”, as well as academic venues such as UbiComp and PerCom,
produced the contemporary vision of the IoT. In 1994, Reza Raji described
the concept in IEEE Spectrum as “[moving] small packets of data to
a large set of nodes, so as to integrate and automate everything from
home appliances to entire factories”. Between 1993 and 1997, several
companies proposed solutions such as Microsoft’s at Work or Novell’s NEST.
The field gained momentum when Bill Joy envisioned device-to-device
(D2D) communication as part of his “Six Webs” framework, presented at
the World Economic Forum at Davos in 1999.
The term “Internet of
things” was likely coined by Kevin Ashton of Procter & Gamble, later
MIT’s Auto-ID Center, in 1999, though he prefers the phrase “Internet
for things”. At that point, he viewed radio-frequency identification
(RFID) as essential to the Internet of things, since it would allow
computers to manage all individual things.
A research article mentioning
the Internet of things was submitted to the conference for Nordic
Researchers in Logistics, Norway, in June 2002; it had been preceded by
an article published in Finnish in January 2002. The implementation
described there was developed by Kary Främling and his team at Helsinki
University of Technology and more closely matches the modern conception
of the IoT, i.e. an information system infrastructure for implementing
smart, connected objects (Fig. \ref{400571}).