Kimte Guite is an assistant professor in the Department of English at Churachandpur Government College, Manipur. She is also a research scholar on Feminist Theology focusing on Christian Feminism. She lives in Manipur with her husband and three beautiful cats.
Published by Rupa Publications India Pvt. Ltd 2019
7/16, Ansari Road, Daryaganj
New Delhi 110002

Copyright © Kimte Guite 2019

Illustrations by Gin Khan Siam

The views and opinions expressed in this book are the author’s own and the facts are as reported by her, which have been verified to the extent possible, and the publishers are not in any way liable for the same.

No part of this publication may be reproduced, transmitted, or stored in a retrieval system, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the publisher.

ISBN: 978-93-5333-658-5

First impression 2019

10 9 8 7 6 5 4 3 2 1

The moral right of the author has been asserted.

This book is sold subject to the condition that it shall not, by way of trade or otherwise, be lent, resold, hired out, or otherwise circulated, without the publisher’s prior consent, in any form of binding or cover other than that in which it is published.
To Simone, Tintin and Banksy
Contents

Introduction
1. Anaesthesia
2. Antiseptic
3. Blood Types
4. Brandy
5. Cement
6. Chewing Gum
7. Chocolate Chip Cookie
8. Clocks
9. Coca-Cola
10. Compass
11. Cornflakes
12. Disposable Diaper
13. Dynamite
14. EnChroma Glasses
15. Gunpowder
16. Hypodermic Syringe
17. Ice Cream Cone
18. Pacemaker
19. Kevlar
20. Liquid Paper
21. Matchsticks
22. Microwave Ovens
23. Modern High Heels
24. Monopoly
25. Paper
26. Pencils
27. Penicillin
28. Plastic
29. Play-doh
30. Pneumatic Tyres
31. Popsicles
32. Post-it Notes
33. Potato Chips
34. Radioactivity
35. Safety Glasses
36. Sandwich
37. Sanitary Pad
38. Silk
39. Stainless Steel
40. Superglue
41. Synthetic Dyes
42. Teabags
43. Teflon
44. The Wheel
45. Thermal Inkjet Printer
46. Tofu
47. Smallpox Vaccine
48. Vaseline
49. Velcro
50. Vulcanized Rubber
51. X-rays
Introduction

Man possesses an instinctive desire to seek new things. This desire has taken mankind from the invention of the wheel to the printing press and onwards to the World Wide Web. New inventions are made every day to cater to our ever-evolving needs and wants.

Looking at the objects around me, I have often wondered how they came into being. The things that I use daily and take for granted must have come from somewhere! They must first have been conceived in someone’s mind. It also made me wonder which of these things were invented accidentally, with the help of a little dumb luck, much the same way we find a coin on the road! It set me on the path of unearthing just how many of the inventions that exist today came about by accident. To my surprise, there were so many of them! Even my favourite, potato chips, were invented by accident!

The fifty-one inventions included in this book are those that I find interesting, for reasons as varied as the inventions themselves. Some of them are new and novel ways of doing the same old thing, as when we went from measuring time with clocks to watches and then to the digital clocks on cell phones.

These inventions are intended to make our lives easier. However, it is up to us what use we make of them. We share the responsibility of being conscientious users and must not exploit these inventions to cause harm and destruction.

Although the inventions here may have been accidental, the determination and perseverance of the inventors in developing and seeing their work through was purely intentional. They already had the skills required to turn an accident into an invention! The number of experiments they carried out to perfect their inventions is inspirational. They show us that luck and chance need to be reinforced with hard work to become an achievement. And perhaps, just perhaps, your moment of ‘accident’ is just around the corner!
1 Anaesthesia

Humans had been performing surgeries long before anaesthesia was invented. In the eighteenth century, surgeries caused patients great pain and loss of blood. There were no effective drugs to stop the nerves from sensing pain. Herbs and opium were used, but they were not powerful enough. The procedure was usually painful and sometimes fatal. In one outdated practice, patients would be hit on the head, just hard enough to make them lose consciousness, and the surgery would be performed as quickly as possible before they could wake up! In some instances, the unfortunate patients woke up before the surgery was over, and some never woke up at all. Ouch! Imagine how painful it must have been! Lucky for us, those days are long past.

What is anaesthesia?
Anaesthesia is a drug, inhaled as a gas or given as an injection before surgical procedures, which makes the body temporarily lose sensitivity to pain and, at times, lose consciousness as well. Substances like nitrous oxide and ether have been popular forms of anaesthesia.
Anaesthesia is an eighteenth-century term derived from the Greek ‘an’, which means ‘without’, and ‘aisthesis’, meaning ‘sensation’. Let’s quickly discuss some of the different types of anaesthesia. General anaesthesia results in the loss of consciousness, as it suppresses the activity of the Central Nervous System (CNS). Local anaesthesia is used on a specific part of the body. It blocks the transmission of impulses in the nerves of the region where it is administered. Hence, only a small area of the body becomes insensitive to pain. This is commonly used in minor surgeries, such as stitching small cuts. Dentists also use local anaesthesia during tooth extractions and other procedures to prevent pain.
The CNS is the part of the nervous system, made up of the brain and spinal cord, that processes sensations such as cold, heat and pain.

Whom do we thank?

In 1799, British chemist Sir Humphry Davy (1778–1829) was working in his laboratory, studying the behaviour of gases. On a whim, he inhaled nitrous oxide and found that the gas made him light-headed and produced a strange sensation in his nerves. He believed that the gas had euphoric and analgesic qualities and named it ‘laughing gas’. Davy was intrigued by his discovery. He began to experiment with it on himself and on his close friends, including the poet Samuel Taylor Coleridge. He suggested that his ‘laughing gas’ had a numbing effect and could be useful in surgeries to ease pain. However, it would take several decades before his idea became common practice in medical science.
Inhaling nitrous oxide dulls the nervous system, and it was used as an early form of painkiller.

Hello, History!

On 30 March 1842, an American surgeon and pharmacist, Crawford W. Long, used ether as an anaesthetic in a surgery to remove a tumour from a patient’s neck. He had discovered that ether had the same effect on patients that nitrous oxide did. The experiment was a success, as the patient did not experience any pain during the surgery. Then, in 1845, he administered inhaled ether to his wife, Caroline Swain, during childbirth. He went on to use the substance during amputations and subsequent surgeries, and promoted the use of ether. However, Crawford Long did not file a patent, and his results were not documented or published until much later, in 1849. He never received his due credit because of the ‘ether controversy’.
The first live demonstration of ether anaesthesia was conducted on 16 October 1846 by William Thomas Green Morton, who painlessly removed a neck tumour from a patient at the Massachusetts General Hospital.

Let’s come to the next part of the story

William Thomas Green Morton learned of the use of ether as an anaesthetic in a lecture at Harvard Medical School. He left medical school, started practising as a dentist and successfully used ether in a tooth extraction. This gained him publicity and caught the attention of Henry Jacob Bigelow, a professor of surgery at Harvard University. Prof. Bigelow arranged for Morton to give a live demonstration of the use of the substance. After its success, Morton filed a patent for ether, which he labelled ‘Letheon’. The operating theatre where Morton made his demonstration came to be called the ‘Ether Dome’.

Two other people also claimed the patent: Charles T. Jackson and Horace Wells. Charles T. Jackson was known for embroiling himself in legal fights over other inventions, like that of the telegraph, for which the real credit went to Samuel Morse. But that is a story for another time. Horace Wells was a former associate of William Morton. He first observed the effect of nitrous oxide at a demonstration by Gardner Colton. Realizing the potential of the substance, Wells experimented on himself by inhaling the gas and having a tooth extracted without feeling any pain. He gave a public demonstration in 1845, which seemed to have failed: the patient cried out in pain, and the audience laughed Wells off the stage. The patient later admitted that he had not felt any pain, nor even remembered his tooth being extracted, but had cried out because he was nervous! The incident proved too great an embarrassment for Wells, and he gave up the pursuit of anaesthesiology (the study of anaesthesia).

The ether controversy

The ‘Ether Dome’ demonstration catapulted Morton to fame.
However, the medical community criticized Morton for filing the patent. According to them, it was against the ethics (code of conduct) of science and the medical profession. Unfortunately, Morton’s attempts to seek recompense from the government for his achievement failed as well.

Meanwhile, Crawford Long came across the December issue of the Medical Examiner, which published accounts of the demonstration conducted by Morton at the Massachusetts General Hospital. He then began a long process of documenting his own experiments. He also collected testimonials from patients on whom he had used ether anaesthesia, as proof that he had used ether before Morton. It was then that he also came to know of the legal battles between Charles T. Jackson, Horace Wells and William T.G. Morton. The dispute came to be popularly known as the ‘ether controversy’.

All’s well that ends well

Owing to his public demonstration, William Morton came to be popularly accepted as the first to introduce ether anaesthesia to the general public. Crawford Long’s due credit came a year after his death, when the National Eclectic Medical Association honoured him posthumously in 1879. In 1864, the American Dental Association honoured Horace Wells for the discovery of modern anaesthesia. Charles Thomas Jackson died on 28 August 1880 at McLean Asylum. Sir Davy was credited with the discovery of the anaesthetic potential of nitrous oxide. Who first discovered ether anaesthesia, however, is still a matter of debate. Each of their contributions to science has been a step forward for mankind. To commemorate the historic achievement that changed modern surgery, without any preference among the claimants, a statue depicting a Moorish medieval doctor was erected, shown supporting a body on his knee and holding a cloth in his hand. It still stands at the Boston Public Garden.
Sir Davy also invented the ‘Davy lamp’ and discovered several elements, including calcium, potassium and sodium.
If you ever undergo surgery, you will now know what kind of gas they used to help you pass out or endure the pain!
2 Antiseptic

What is the first aid you receive when you skin your knee while playing or cut yourself accidentally? It’s called an antiseptic. It comes as an ointment or solution, which you apply directly to the wound to prevent infection. Can you imagine what would happen if such wounds were left untreated? An open wound, no matter how small, can become infected. If left untreated, it could even have serious consequences: the infected area might have to be amputated (meaning cut off, in scary medical terms), or the infection could even result in death! Fortunately for us, we need not be too scared of a cut here and there while playing. Apply an antiseptic after cleaning the wound, and it will fight off any infection.

A slice of history

Antiseptics have been used since time immemorial to treat infections, even before the causes of infection were discovered. Herbs and plant extracts were used effectively by the ancient Egyptians to treat wounds and cuts.
However, it was only in the nineteenth century that antiseptics were first put to large-scale use. Joseph Lister (1827–1912) was the first physician to use the first form of modern antiseptics. He was working as a surgeon at the Royal Infirmary in Glasgow. In 1867, Lister was inspired by Louis Pasteur’s ‘germ theory’. At that time, because of the high danger of sepsis and infection, even patients undergoing minor surgeries had a high risk of dying. Lister would wash his tools, dip the bandages and cotton he used for surgery in a solution of phenol, and cover the wound with them. He did not know why the treatment was effective, but he continued using phenol as an antiseptic for treating wounds.

Paul Ehrlich (1854–1915) took the idea a step further inside the body. In 1910, he introduced arsphenamine, also known as Salvarsan, for the treatment of infectious diseases. The German-Jewish physician and scientist had discovered that some chemical dyes were able to colour certain bacterial cells while other cells were not affected. He realized that it should be possible to produce substances that would selectively kill bacteria without causing any harm to other cells. This became the first working principle of antibiotics. In 1908, Ehrlich won the Nobel Prize in Medicine.
Gram-staining is a method in which bacteria are classified according to how their cell walls respond to staining with a dye. Bacteria can be gram-positive or gram-negative: gram-positive cells retain the colour of the dye, while gram-negative cells wash it out.

Antiseptics are a boon to modern first aid, as they provide a quick-fix way of tending to wounds. Most importantly, they can be used by anyone with a basic knowledge of first aid. The next time you see a first-aid box, you will be able to identify the antiseptic ointment or solution it contains. Perhaps you can even share the interesting story of its origin with your friends.
Make sure you know which medicine in your first-aid box is an antiseptic. Who knows when you might need it!
3 Blood Types
Do you know what your blood type is? Knowing your blood type, writing it down on your identity card or keeping the information handy could save your life in an emergency. The doctors would know immediately what type of blood to transfuse, should you require a transfusion. Besides, there could also be situations where you are required to donate blood; knowing your type saves the time of drawing a blood sample and determining it.

Why do we need blood anyway?

Blood is the essence of life. It consists of red and white blood cells, plasma and platelets. Blood supplies oxygen and essential nutrients to different parts of the body, collects waste and defends the body from infection.

ABO groups
It is believed that humans and apes inherited these blood types from a common ancestor more than twenty million years ago. The ABO blood group was first discovered by an Austrian-American immunologist and pathologist named Karl Landsteiner in 1901, a discovery that won him the Nobel Prize. Landsteiner took blood samples from healthy scientists, including himself, and mixed them together in test tubes. He made several mixtures of the samples and observed them. He found that some samples mixed smoothly, while others clumped up in the test tube.

According to Landsteiner (1868–1943), there are four blood types in humans and apes, categorized as A, B, AB and O. They are collectively known as the ‘ABO blood group’. The blood type of an individual is determined by the antigens found on the Red Blood Corpuscles (RBCs). A person with Type A blood has antigen ‘A’ in their body and produces antibodies that attack B antigens. Therefore, Types A and B cannot donate blood to each other, as the antibodies in each body will attack the other’s RBCs when they enter. Landsteiner also discovered that blood types are inherited from parent to offspring.
Antigens are substances that stimulate the immune system to form antibodies, which defend the body from harmful viruses and bacteria.

Transfusion

In 1907, six years after Landsteiner’s discovery, Reuben Ottenberg, an American doctor, conducted the first successful blood transfusion between two people at Mount Sinai Hospital in New York. Then, in 1916, Oswald Robertson demonstrated that refrigerated blood could be used in transfusion. It was also discovered that people with Type O blood can donate blood to any of the four ABO blood types; they are therefore called ‘Universal Donors’. Those with Type AB blood can receive from any blood type but donate only to AB type. Hence, they are called ‘Universal Recipients’.
In the seventeenth century, Jean-Baptiste Denis was said to have transfused a small amount of sheep’s blood into a boy!

Now, here’s a simple task for you: find out your blood type. Then discuss blood types with your friends.
Who among your friends can you give blood to and whom can you receive blood from?
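The donation rules above boil down to one antigen check: a donor is compatible when the recipient's antibodies have nothing in the donor's blood to attack. Here is a small sketch of that rule (not from this book; it is illustrative only and ignores the Rh factor and other blood group systems):

```python
# ABO compatibility sketch (illustrative; ABO antigens only, no Rh factor).
# Each blood type is described by the set of antigens on its red blood cells.
ANTIGENS = {"A": {"A"}, "B": {"B"}, "AB": {"A", "B"}, "O": set()}

def can_donate(donor: str, recipient: str) -> bool:
    """A donation works if every donor antigen is already familiar to the
    recipient, i.e. the donor's antigen set is a subset of the recipient's."""
    return ANTIGENS[donor] <= ANTIGENS[recipient]

# Print the full compatibility table.
for donor in ["O", "A", "B", "AB"]:
    receivers = [r for r in ["O", "A", "B", "AB"] if can_donate(donor, r)]
    print(f"{donor} can donate to: {', '.join(receivers)}")
```

Running it shows Type O donating to everyone (the ‘Universal Donor’) and Type AB receiving from everyone (the ‘Universal Recipient’), exactly as described above.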
4 Brandy

Brandy is an alcoholic drink derived from fruits such as apples, grapes, apricots, blackberries, etc. The fruits are fermented and distilled to produce an aromatic beverage. The taste of a brandy is also determined by the region where the fruits are grown and varies from one region to another. The distillation of wine was carried out as a means of preservation and to ease transport. Another very important reason was to lessen the tax, which was measured in terms of volume.

The legend

It is believed that in the sixteenth century, a Dutch trader removed the water content from wine so that he could transport more of it and lessen the tax. On reaching his destination, he discovered that the wine stored in the wooden casks tasted better than ordinary distilled wine. That is how the drink we know as ‘brandy’ was born. The name is derived from the Dutch term ‘brandewijn’, which means ‘burned wine’.

Purity test

Despite its humble origins, brandy rapidly gained popularity, and the distillation process was adapted and improved to obtain the optimum taste. As most beverages can be adulterated (mixed with other substances), winemakers came up with ways to test its purity. One method involves setting a small portion of the liquor on fire: if the fire consumes all of it without leaving any residue, the wine is pure. Another method involves placing gunpowder at the bottom of the spirit and setting it alight: if the gunpowder ignites after the fire has consumed the liquor, it has passed the purity test.

Preparation

The 1728 edition of Cyclopaedia describes the process of distilling brandy. The grapes used for making brandy are plucked early so that they are more acidic and have a low sugar content. In the first phase, a large part of the water and solids is removed from the base to obtain ‘low wine’. In the second stage, the concentrated low wine is distilled to obtain the brandy. In the third stage, the distilled brandy exits the container in three phases, known as the ‘head’, ‘heart’ and ‘tail’, respectively. The head and tail portions are extracted together and then reused along with the low wine for distillation. The most prized portion is the ‘heart’, which has a rich aroma and is extracted and preserved.

After distillation, the newly extracted brandy is placed in oak barrels and left to mature. Some brandies are aged in single casks while others, particularly brandies from Spain, are aged using the solera system, in which the barrels used for storing the wine are changed each year and the duration of ageing varies according to the requirements of the producers. The finest brandies are produced in the town of Cognac, and only brandies produced in the region around this town may authentically be named ‘Cognac’.
Brandy follows a traditional age-grading system, in which grades correspond to the number of years the spirit has been aged in casks. Normally served at room temperature, brandy can also be used to make cocktails. It has culinary uses too, as a deglazing liquid when cooking meat or in baking. Although brandy made its appearance rather late, in the sixteenth century, it has become an important part of drink menus in restaurants and bars, both for its flavour and for creating cocktails.
‘Deglazing’ is the term used in cooking when the food residue stuck to the pan is removed and mixed into the food to enhance its flavour.
A Word of Caution: Drinking is injurious to health. But knowing the interesting history never hurt anyone!
5 Cement

If you look around, the homes that we live in, the schools that we go to, the pavements that we walk on, etc., all have one thing in common: they are made of concrete, which contains cement.

Concrete? Cement? Aren’t they the same thing? Well, not quite. Although they are closely related and often used interchangeably, concrete and cement are two different things. Before we dig through the rubble of its history, let’s clarify the difference between the two. Concrete is a building material used in construction, made of aggregates and paste. The aggregates are composed of gravel or sand, which is mixed with the paste. The paste, in turn, is a mixture of water and cement. Cement itself is a mixture of limestone and clay. Hence, all concrete is made with cement, but cement does not contain concrete!
The Roman recipe

There is a very interesting backstory to the invention of cement, and it all begins with a thief and a con man. But first, some ancient history. Evidence of a crude form of concrete, made with clay and ‘volcanic ash’, was found in the floors of the sewers in Crete. Despite the tragedy of its origin, volcanic ash helped the Romans create a cement that would form the strongest concrete in the world. The word ‘concrete’ comes from the Latin ‘concretus’, which means ‘to grow together’, as several components combine to form the building block. The Romans, however, used the term ‘cementis’, meaning ‘rocky stuff’, to refer to concrete.
Volcanic ash is also known as pozzolana, a name that comes from Pozzuoli, a town near Mount Vesuvius. The volcano erupted in 79 AD and destroyed the Roman city of Pompeii. The volcanic ash fell on the city, and the ash-covered bodies were preserved like statues frozen in time.

The solid history

The wealth and greatness of a civilization is measured by the things it leaves behind, and the Romans have left plenty to remember them by. The winding roads, the grandeur of the Colosseum and the majestic splendour of the Pantheon are proof of the enduring architecture of the Roman civilization. They used pozzolana, mixed with sand and ground-up rocks, for construction. Emperor Vespasian, of the Flavian dynasty, ordered the construction of the Flavian Amphitheatre, which could hold more than 50,000 people. Can you guess what the structure is called today? The Colosseum! The Colosseum was also made using a combination of bricks. The thick walls of the Pantheon, built by Emperor Hadrian, are also made of Roman concrete, or cementis, covered on both sides with bricks. The structure was built to last as long as the immortal gods to whom it was dedicated.

Unfortunately, the records and details of Roman architecture were lost when the empire fell in 476 AD. With it, the Roman recipe for concrete was lost forever, except for the treatise of Vitruvius, On Architecture. Even this remained a puzzle until Giovanni Giocondo solved it almost a thousand years later.

The comeback

In the sixteenth century, a bricklayer in Germany acted on a whim and mixed ‘trass’ with limestone. Trass is another kind of volcanic ash. He learned that the mixture was much stronger and water-resistant. Trass had by this time been discovered in Andernach, in Germany, and it was similar to the lost pozzolana. In the mid-eighteenth century, John Smeaton, a civil engineer, was commissioned to build a lighthouse on the Eddystone Rocks, a rocky and dangerous stretch of seacoast. Smeaton was up to the task! He came up with a component for concrete that would be stronger than any used before. He experimented with limestone taken from the town of Aberthaw in Wales and found that the limestone had a high clay content, forming what we now know as natural cement. More than a century later, the lighthouse was disassembled and rebuilt in Plymouth. The reason was that the rocks on which it stood were beginning to erode, but the concrete was still intact!
Smeaton is known as the ‘Father of Civil Engineering’.

The cement con

Joseph Aspdin (1778–1855) was a nineteenth-century bricklayer in England. He was a bit of a crook and was fined twice for stealing limestone bricks from the paved roads in Leeds. He used the bricks for his experiments, which resulted in a cement mixture he named ‘Portland cement’.

The story then moves on to his son, William Aspdin. William was also a bit like his father, and he experimented with ‘clinker’, which he collected from cement factories. One day, he crushed the clinker till it turned into powder and mixed it with other cement ingredients. The result was a success! William Aspdin had invented a new type of cement, twice as strong as the cement of the Romans.
‘Portland cement’ is named after the Isle of Portland, which has an abundance of limestone.

However, William found it difficult to sell his product, so he claimed that his cement was made from the same recipe invented by his father. It is said that, in order to hide the recipe of his ‘Portland cement’ from competing firms, he would place different chemicals around his factory, easily visible to everyone. This was a clever trick to keep his recipe a secret. It is ironic that William should stumble upon such a unique invention and yet, given his lack of integrity, see his business partnerships turn out to be catastrophic failures. The accidental genius, William Aspdin, died in 1864 at the age of forty-eight. His recipe is still used to produce Portland cement to this day.
Powdered limestone is mixed with clay and water. The paste is then baked in a furnace till it solidifies, and then crushed into cement powder. ‘Clinker’ is the material that is rejected and thrown away when the paste has been baked for too long.

Setting the bar

In 1880, an engineer named Ernest Ransome (1852–1917) tried bonding iron rods with concrete. He found that they bonded perfectly, and the iron bars could even be twisted to suit the desired shape of the concrete. The embedded bar is known as a ‘reinforcing bar’, or ‘rebar’. This technique paved the way for modern architecture and was improved upon by later engineers and architects. However, they found that the iron bars would rust over time, as the concrete erodes on contact with seawater. Roman concrete proved far superior: apart from being waterproof, it seemed to strengthen on contact with seawater. Modern concrete still takes second place to the ancient masters.
Find out where and what kind of cement is used for constructing your house or local buildings.
6 Chewing Gum
The practice of munching on chewing gum-like substances is an old tradition. The ancient Greeks were known to chew ‘mastiche’, extracted from the resin of the mastic tree. The Mayans chewed ‘chicle’, the resin of a tree in the Americas. The American settlers picked up the habit of chewing the sap of spruce trees and beeswax from the Indians and made gums from them. In 1848, John Curtis sold the first commercial gum, made from spruce sap. Two years later, he sold a paraffin-sweetened gum, which was quite popular among consumers.

O my gum!

The credit for the invention of the modern chewing gum goes to Thomas Adams, a resident of New York City. In the 1850s, Adams was working as a secretary for Antonio Lopez de Santa Anna, the former president of Mexico, who was in exile on Staten Island. Santa Anna was very fond of chewing the gum of the Manilkara tree, also known as chicle. He suggested that Adams experiment with chicle to make synthetic rubber tyres. The suggestion was a welcome one, as rubber was expensive at the time and hard to come by.

Adams began experimenting with chicle, attempting to make boots, toys, masks, etc. However, all his experiments failed. Disheartened and frustrated, Adams was at a loss as to what his next step should be. In a moment of distraction, he popped a piece of chicle into his mouth and began chewing it. To his surprise, he found that the gum was not bad at all! Adams then hit upon an idea: instead of making synthetic rubber, why couldn’t he make chewing gum and sell it?

Adams rolled the gum into tiny balls and wrapped them in colourful tissues to make them look more appealing. He sold them under the label ‘Adams New York Snapping and Stretching Gum’. He approached drugstores and requested them to stock his gums, and within a few days the demand poured in on an unprecedented scale. Adams even designed a manufacturing machine, which he patented in 1871. Adams’s chewing gums came in beautiful wrappers with a picture of New York’s City Hall.

American Chicle Company

In 1888, Adams Gum Company introduced the first vending machines (automatic machines for selling things) in the USA, installing them in subway stations in New York. Adams also introduced another flavour, called ‘tutti-frutti’. By 1899, Adams Gum Company controlled the gum business. The largest manufacturers, including W.J. White and Son, Beeman Chemical Company, Kisme Gum and S.T. Briton, merged with Adams Gum Company under the banner ‘American Chicle Company’. A candy salesman came up with the idea of wrapping chicle in candy shells; these were sold as ‘Chiclets’, which became part of the American Chicle Company in 1914.
As the world entered two destructive wars within the span of a few decades, the demand for chewing gum grew so high that there were not enough raw materials to meet it.

The gum legacy
Although Adams made a fortune from chicle chewing gum, Santa Anna, the man who introduced him to chicle, made no profit from it. Thomas Adams, the lucky inventor of the modern chewing gum, died on 7 February 1905 in New York. In 1962, the American Chicle Company was sold off and renamed ‘Adams’, in honour of its pioneer. Currently, the brand is owned by Cadbury. The success story of Thomas Adams is not just about a stroke of luck and happy accidents; it is the story of a man who refused to give up despite repeated failures and finally managed to turn those failures into success.
So, the next time you are caught in a sticky situation, don’t give up!
7 Chocolate Chip Cookie
What’s your favourite cookie? Do you nibble on it or take a big bite?
Dunking the cookie into a glass of milk is fun too! It's the perfect snack for a quick break from playing or homework. Lucky for us, it comes in so many flavours! But nothing beats the taste of the good old chocolate chip cookie. Wouldn't it be nice if we could have a house made of cookies, like the cake house Hansel and Gretel found in the woods? You can find out more about the story of Hansel and Gretel later, but let's get back to our delicious topic. So, where did this delicious cookie come from? Let's get to the bottom of this cookie jar.
Blood, 'sweet' and tears
The chocolate chip cookie was invented by Ruth Graves Wakefield. There are two versions of how it was invented. The more famous one describes how Ruth added bits of Nestle semi-sweet chocolate to the cookie dough by accident because she had run out of melted chocolate.
removed Mrs Wakefield's name from the product. The chocolate chip cookie is still one of the most delicious cookies ever invented. On days when we feel low, there is very little that a warm glass of milk and a chocolate chip cookie cannot fix.
The world’s largest chocolate chip cookie, in Flat Rock, NC, weighed about 18,144 kg, was 102 feet in diameter and contained 30,000 eggs!
It would have been very difficult to do even the most ordinary things if there
were no common standard of measuring time. What time would our classes begin? When would the recess bell ring? We could use the sun! But what would we do on cloudy days? That is why we use clocks—instruments that help us measure and indicate time. The sundial has been used since ancient times. It indicates the time through the shadow that an object casts on a flat surface as the sun moves across the sky. The hourglass and the water clock are other varieties of time-measuring devices.
Time before Big Ben
Was there ever a period before the concept of a day with twenty-four hours? How did people maintain their schedules over the course of a day then? Around the second century BC, Greek astronomer Andronicus supervised the
construction of the 'Tower of the Winds' in Athens. It was used as an ancient timepiece. In 797, Harun-al-Rashid of Baghdad was said to have presented Charlemagne with an elaborate device, which resembled a water clock. Around 1000 AD, clocks were introduced in Western Europe by Pope Sylvester II. Detailed descriptions of the clocks constructed by Richard of Wallingford some time before his death in 1336, and of those built by Giovanni de Dondi in Padua in 1348–64, still exist today.
Horology is the study of time and timekeeping.
The Galilei curiosity
The concept of the pendulum was first conceived by Galileo Galilei, the same man to whom we owe the telescope. The idea occurred to him purely by chance. When Galileo was a student in Pisa (the place with the leaning tower!), in Italy, he observed that the lamps hung in the cathedral swung to and fro when the wind was strong. Since the watch was yet to be invented, Galileo used the pulse in his wrist to measure the time it took for a lamp to swing to and fro. He observed that the time taken for each swing stayed the same, whether the swing was wide or narrow. This law came to be known as the 'isochronism of the pendulum', or the pendulum law.
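Galileo's observation can be checked with the standard small-swing pendulum formula from physics (not given in this book): T = 2π√(L/g), where the period T depends only on the pendulum's length L and gravity g, not on the weight of the bob or, for small swings, how wide it swings. A rough sketch, with the 1-metre length chosen purely as an example:

```python
import math

def pendulum_period(length_m, g=9.81):
    """Time for one full swing (small-angle approximation), in seconds."""
    return 2 * math.pi * math.sqrt(length_m / g)

# A pendulum about 1 metre long takes roughly 2 seconds per swing,
# however heavy its bob is — just as Galileo observed.
print(round(pendulum_period(1.0), 2))  # prints 2.01
```

Doubling the length does not double the period: a 4-metre pendulum swings only twice as slowly as a 1-metre one, because the length sits under a square root.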
Other units for measuring the passage of time are: day, month, year, etc.
We measure these using a calendar.
Owning time
In the thirteenth and fourteenth centuries, clocks began to gain importance in Europe. However, they were only installed in important places, as installing clocks was an expensive affair. The three earliest surviving clocks were installed in the cathedral in Salisbury (1386), above a bridge in Rouen (1389) and in Wells (1392). In the fifteenth century, smaller versions of these cathedral clocks were installed in palaces by the nobility.
In 1094 AD, Su Song completed the first water-clock tower in China. In 1656, Huygens invented the pendulum clock using Galileo's law of the pendulum, or isochronism. The pendulum clock remained the most accurate instrument for keeping time for almost three centuries, until the discovery of piezoelectricity. In 1675, Huygens developed a device called the hairspring, a spring that controls the speed of oscillation. This made it possible to build much smaller clocks and led to the introduction of the pocket watch. In the seventeenth and eighteenth centuries, the trade of manufacturing clocks was dominated by the English, whose main customers were the English elite. The lower classes still depended on natural indicators of time, like the sun, the moon and the position of the stars. In the nineteenth century, clockmaker Eli Terry from Connecticut and some others managed to mass-produce clocks with interchangeable parts. This enabled the public to acquire cheaper clocks. Checking the time was no longer expensive!
Another breakthrough in horology came from Jacques and Pierre Curie in 1880, when they discovered the piezoelectric properties of quartz; Walter G. Cady later invented the quartz oscillator. Piezoelectricity is the electrical charge that builds up in solid materials such as crystals (e.g., quartz) when mechanical stress is applied. What is quartz? Quartz is a mineral made up of silicon and oxygen (SiO2). In a quartz clock, what is the electronic oscillator, which creates an oscillating electronic signal, made of? Quartz, of course! Quartz clocks became the most popular and most accurate timepieces until the introduction of atomic clocks.
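Here is a fun aside (the figure below is a widely used industry standard, not something this book mentions): the tiny quartz crystal in most wristwatches is cut so that it vibrates exactly 32,768 times a second. Why that odd number? Because 32,768 is 2 multiplied by itself 15 times, so a simple electronic circuit can halve the signal again and again to get one tick per second:

```python
# Quartz watch crystals commonly vibrate at 32,768 Hz (which equals 2**15).
frequency = 32768
halvings = 0
while frequency > 1:
    frequency //= 2   # each divider stage in the watch halves the frequency
    halvings += 1
print(halvings)  # prints 15 — fifteen halvings turn 32,768 Hz into 1 tick per second
```

This is why quartz watches keep such steady time: counting a fast, regular vibration is far easier than building a perfectly regular slow one.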
Does the name Curie sound familiar? It should. One of the two brothers, Pierre, is the husband of the famous physicist Marie Curie! The first accurate atomic clock was built by Louis Essen in 1955, and atomic clocks remain the most accurate clocks to date. There are different types of clocks in existence. One example is the analogue clock, which has a short hand pointing at the hour and a longer hand indicating the minute. The most commonly used clock now is the digital clock, found on smartphones and other electronic devices. Auditory clocks, which announce the time aloud, are convenient where the timepiece is not visible, and they can be programmed to sound at a specific time, like an alarm clock. They are very useful for the visually impaired. It is difficult to imagine what our daily schedule would be like if there
were no clocks. Also, it would be a mammoth task to even set up meetings or appointments with people who belong to different time zones. Luckily for us, that need not be the case, as we can plan our daily schedule around the twenty-four hours on our watch.
Which time zone do you live in? Can you identify the difference between your time zone, and say, London or Cuba or Japan?
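If you have a computer handy, you can look up such differences yourself. A small sketch (it assumes Python 3.9 or newer with the `zoneinfo` module and its time zone database available; the date and zone names are just examples) comparing noon in India with the clocks in London and Tokyo:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # needs Python 3.9+

# The same instant in time, shown on clocks in three different time zones.
moment = datetime(2019, 6, 1, 12, 0, tzinfo=ZoneInfo("Asia/Kolkata"))  # noon in India

london = moment.astimezone(ZoneInfo("Europe/London")).strftime("%H:%M")
tokyo = moment.astimezone(ZoneInfo("Asia/Tokyo")).strftime("%H:%M")
print(london, tokyo)  # prints 07:30 15:30
```

So when it is noon in India on that day, it is only 7.30 in the morning in London (which is on summer time) and already 3.30 in the afternoon in Tokyo.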
If you look at any Coca-Cola plastic bottle, you will find 'since 1886' written under the label. The year marks the invention of this world-famous beverage by John Pemberton of Knoxville, Georgia.
The man behind the invention
Dr John Pemberton (1831–88) was a physician and pharmacist. He served in the Confederate Army during the American Civil War (1861–65). He suffered an injury to the chest at the Battle of Columbus in 1865, which continued to cause him constant pain. In order to ease the pain, he used morphine, which was the most common painkiller in the nineteenth century. He soon became addicted to the substance. He started searching for a
morphine-free substitute to ease his pain. The concoction that Pemberton extracted from his experiments was sold as 'Pemberton's French Wine Coca', a medicine to counter depression and post-traumatic stress among war veterans.
Coca-Cola
The enactment of the prohibition legislation made it necessary for Pemberton to find a non-alcoholic alternative. He sought the help of Willis E. Venable. Venable owned a drugstore in Atlanta, and the two began working on finding the right recipe. In 1886, while they were working on their concoctions, Pemberton accidentally mixed the base syrup he used in his earlier 'Pemberton's French Wine Coca' with soda water. The mixture sizzled and frothed with bubbles. This later came to be known as Coca-Cola.
Components: 'Coca' is a plant native to South America. 'Kola nut' is a fruit found in the rainforests of Africa. It contains caffeine. 'Damiana' is a flowering shrub found in southern Texas and South America. The new invention was a success! Unfortunately for Dr Pemberton, he could not have foreseen the commercial giant his invention would become. Ill health and financial struggles drove him to sell the rights to his secret formula. Although he tried to retain part of the patent for his son, the latter decided to cash in, and the patent was sold to Asa Griggs Candler in 1888. Pemberton's son Charley Pemberton retained the 'Coca-Cola' trademark. After the sudden death of Pemberton on 16 April 1888, Candler sought to obtain sole rights to the drink by forcing co-owners out of business, and on 30 August 1888, according to his son Charles
Howard Candler, Asa Griggs Candler became the sole owner of Coca-Cola. Charley Pemberton died from opium overdose on 23 June 1894, bringing an end to the Pemberton legacy.
In 1886, Atlanta, in Georgia, adopted a law which banned the purchase, manufacture and sale of alcoholic beverages. This is the prohibition law.
The Coca-Cola bottle
The company faced a bottling problem: carbonated drinks needed to be packaged in airtight bottles or the fizz would escape. In 1899, Candler signed a contract with three other businessmen to produce and sell the drink. They developed a bottling system, which would be used in the 1,000 bottling plants that existed at the time. The distinct shape of the Coca-Cola bottle was first launched in 1916 and later trademarked in 1977. This made it easy to distinguish the genuine beverage from those of imitators.
Frank Mason Robinson, an advertising agent and marketer working with Pemberton, coined the term 'Coca-Cola' by combining the names of the two main components.
Coca-Cola campaign
The first campaign strategy for Coca-Cola, in 1887, used coupons for free samples and advertisements in newspapers. Almost a century later, catchy taglines were attached to the drinks, emphasising the element of fun and companionship. During the 1928 Olympic Games in Amsterdam, Coca-Cola set up kiosks around the Olympic venues to supply the drink to the Olympic crowd.
Civil rights are a person's rights to freedom, which protect them from discrimination on grounds of race, nationality, gender, religion, etc. In 1945, the Coca-Cola Company registered the trademark name 'Coke', and five years later, Coca-Cola became the first product to appear on the cover of Time magazine. In the 1950s, with a view to addressing civil rights, the company's advertising began to include African-Americans, such as the Olympians Alice Coachman and Jesse Owens. For the first time in its ninety-nine years of existence, the formula for the drink was changed in 1985 under the name 'New Coke'. People boycotted and protested against the changed formula, and the product returned to its original recipe after seventy-nine days with another name, 'Coca-Cola Classic'. Old is gold indeed! In the 2000s, the Coca-Cola Company sponsored several major global events, including the Olympic Games in Sydney in 2000, the fight against AIDS in 2001, the FIFA World Cup in Japan and Korea, American Idol, the Beijing Olympic Games in 2008, etc. Till date, it is consumed as much as ever in its 133 years of existence.
The Coca-Cola Foundation has been helping communities around the world since the company established it in 1984.
In 1925, the board of directors of the company passed a resolution to put the secret formula for Coca-Cola in the vault of Atlanta’s Trust Company Bank.
How do you remember your way from the local park to your house? Is it
because you have a tiny compass inside you, which gives you the directions? Or because you remember the familiar route? In this era of smartphones, we have GPS and Google Maps, but what about many centuries ago? What do you think ancient travellers used for navigation? It is said that Vikings would carry caged birds and release one bird at a time. The idea was that if the birds could not spot land nearby, they would return to the ship; if they did not return, the sailors would assume that there was land nearby. But birds can be very unreliable sometimes!
Made in China
The Chinese saved the day with the invention of the compass. The compass made navigation much easier for more than a thousand years. Christopher
Columbus, Vasco da Gama, Ferdinand Magellan and many others would have been lost at sea without this important invention! Chinese literature dating back to the fourth century mentions the use of a lodestone compass as a navigation device. Emperor Wang Mang of China, in the first century, was said to have used a lodestone so that he could always sit facing south. The compass was first used as a device in fortune telling! Only later was it used as a navigational device. In the twelfth century, Alexander Neckam described how sailors navigated with a compass made by touching a needle to a magnet; the magnetized needle's point would face north.
Magnetic flaw
In 1707, during the War of the Spanish Succession, the British Royal Navy fleet lost four warships and around 2,000 men off the Isles of Scilly. One of the reasons attributed to the tragedy was flawed compasses on board. Moreover, the great quantity of iron used in the construction of the ships disturbed the compasses.
Lodestone is a mineral that has natural magnetic properties and can attract iron. The discovery of this mineral also made it possible to study the process of magnetism.
Electromagnetism
In 1820, Hans Christian Ørsted (1777–1851), a Danish scientist, accidentally came across the relationship between magnetism and electricity. He was preparing for a lecture when he unknowingly brought a compass close to an electrical wire. He observed that the needle of the compass jumped and
swung away from north whenever current flowed in the wire. On investigating further, Ørsted ended up discovering the phenomenon of electromagnetism. Electromagnetism is the phenomenon in which electric currents and magnetic fields interact. Ask your physics teacher to help you recreate Ørsted's experiment.
The War of the Spanish Succession was triggered by the death of Charles II of Spain, who died without a legitimate heir.
Types of compasses
The liquid compass is said to have been introduced by Sir Edmond Halley in 1690 at a meeting of the Royal Society. In 1813, Francis Crow patented the first practical liquid compass for use in small boats, mainly for maritime use. The bearing compass was the most popular type of compass used around the eighteenth century. In 1928, a Swedish instrument designer, Gunnar Tillander, invented a new bearing compass, which was improved and modified, and its variations are still in use today. In the twenty-first century, we hardly use a compass for navigation, as a navigation app can easily be installed on our smartphones. Yet it was the invention of the compass that led to the development of hi-tech navigation devices. The humble compass has come a long way and has made major contributions to mankind.
Ask if your grandparents have those old compasses lying around the house, and if they do, why not ask them to teach you how to use them?
What’s your favourite breakfast? Breakfast is such an important part of our
daily routine. Yet, it is a boring task to prepare meals as soon as we get up in the morning! That's why cereals are the most common part of the modern-day breakfast, as they come in processed, ready-to-eat forms. Possibly the most popular among these cereal breakfast options is cornflakes, a favourite among children. Parents who struggle to feed morsels of food to their toddlers will testify to how cornflakes saved their mornings. Today, the most popular brand of cornflakes the world over is Kellogg's, invented by John Harvey Kellogg in 1894.
The Kellogg Saga
John Harvey Kellogg (1852–1943) was an American doctor, author and nutritionist. From the humble beginnings of his family's broom business, he rose to the position of director of the Battle Creek Sanitarium, a post he held until it was sold in the era of the Great Depression in the 1930s. The Sanitarium, or The San in
short, was intended to promote the Clean Living Movement, as part of a progressive health-reform movement that swept North America in the nineteenth century. The movement advocated abstinence from alcohol, tobacco, sugar and other foods and practices, which were considered harmful for the human body. As director, John Kellogg developed the diet for the Sanitarium inmates. In 1894, a routine preparation of food led to the accidental invention of a breakfast cereal that has come to represent breakfast itself today.
The Great Depression was a collapse in the economy, characterized by unemployment, a decline in purchasing, debts, etc., which began in the USA in 1929 and spread to several countries in the 1930s.
Absent-minded genius
There are varying accounts of the manner of its invention, but the most accepted version goes like this. John and his younger brother, William Keith Kellogg, were preparing food for the patients at the Sanitarium when they accidentally left a batch of wheat dough out overnight. In the morning, rather than discarding the dough and wasting food, the brothers decided to roll it out. Instead of forming the usual flat sheet, the dough broke into thin, crunchy flakes. The first cornflakes were born! John then asked his brother, William, to retrace the process they had followed in coming up with the new recipe. William was successful in his mission and even started a small business from the Sanitarium kitchen—supplying cornflakes. His customers were mostly former inmates who loved the crunchy cereal flakes. They couldn't get enough of it even after leaving the
place! John Kellogg was granted a patent on 14 April 1896 for ‘Flaked Cereals and Process of Preparing Same’. In 1898, the first commercial batch of cornflakes was sold as ‘Sanitas Toasted Corn Flakes’.
The San became a reputed wellness centre. Notable personalities like Amelia Earhart, George Bernard Shaw, Thomas Edison, Henry Ford, etc., came to The San, seeking to rejuvenate their health under Kellogg's care.
War of cornflakes
With success came a greater test. The brothers had differing opinions regarding their product. William Kellogg wanted to add more flavours and started experimenting with sugar. Sugar was strictly forbidden at The San, as John considered it one of the foods harmful to the body. This created a rift between the brothers. William Kellogg left the Sanitarium to start the Battle Creek Toasted Corn Flake Company in 1906. In 1910, he filed a lawsuit over the use of the name Kellogg by John, claiming it was misleading and confusing for his customers. The court ruled in his favour and John was restricted from using the Kellogg name. These bitter legal battles led to increasing resentment and further deepened the rift between the two brothers. This would remain the dark cloud in their success story.
The real Kellogg
William Kellogg wanted to expand his business. He devised a plan to market his product on a larger scale and came up with a very clever idea. He decided to give away free samples to housewives so that they would request their local grocers to stock his product. He also launched the 'Wednesday is Wink Day' advertisement in New York. Free samples were given to anyone who winked at their grocer on Wednesday. This was a daring move
and it paid off. Sales in New York skyrocketed beyond his expectations. He also invested in newspaper advertisements pointing out the trademark: 'W.K. Kellogg' was printed on the covers to distinguish his products from those of imitators. The company was later renamed the Kellogg Company and is still the leading manufacturer of cornflakes today. William went on to make a fortune from it.
Beyond cornflakes
Apart from inventing cornflakes and changing the breakfast menu, John Kellogg was one of the most influential physicians of his time. He was credited with many inventions and held several patents for contraptions used in physiotherapy, electrotherapy, hydrotherapy, phototherapy, etc., some of which were installed in the Titanic's first-class gymnasium. He was a supporter of the ban on alcohol and tobacco in the early twentieth century and campaigned against the consumption of coffee and tea because of their caffeine content. He was a champion of the Clean Living Movement and was a major influence in the passing of the Volstead Act on 17 January 1920. Also known as Prohibition, it was a constitutional ban on alcoholic beverages that lasted for thirteen years. After the Sanitarium was sold, Kellogg moved to Florida at the age of seventy-eight and opened the Miami-Battle Creek Sanitarium, which was quite successful. Although the Kelloggs did not have any biological children, they opened the Haskell Home for Orphans in 1894 and adopted some of the children. In a heart-warming effort to end the feud with his brother, John drafted a letter in which he apologized for downplaying his brother's contribution to the invention of cornflakes and sought to make amends. However, his secretary did not send the letter, as she felt that her employer was demeaning himself, and William Kellogg saw the letter only after his brother died on 14 December 1943. The brothers were never reconciled during their lifetimes.
The Rooster mascot was used because Kellogg’s friend pointed out that the Welsh word ‘ceiliog’, which means rooster, sounds the same as ‘Kellogg’.
12 Disposable Diaper
What comes to your mind when you think of a baby? Feeding bottles? Toys? Diapers? We often associate these things with babies because they are essentials for a baby today. They are also less time-consuming and more convenient for mothers. When was the first diaper invented? What did people use before then? Let's try and find out more.
Shitty business
Long ago, babies were covered with leaves. Eskimo mothers used moss covered with sealskin. Inca mothers used packed grass and covered it with rabbit skin. In Elizabethan times, several days would pass before diapers were changed! In some cultures, the concept of diapers did not exist at all, as the babies did not wear any clothes. By the twentieth century, cloth diapers were most commonly used. However, cloth diapers required frequent washing, collected stains and were not very hygienic. Mothers would often make plastic coverings for the cloth diapers so that urine or stool did not leak
through. They were practical enough, but the problem of diaper cloths needing constant scrubbing and washing remained to be solved.
Diaper originally meant a small pad with repeating geometric shapes. It later evolved to describe white cotton or linen with geometric patterns. In Britain, it is referred to as a 'nappy'.
The Boater
The first absorbent pad used as a diaper was made from unbleached creped cellulose tissue by Paulistróm in Sweden in 1942. However, it was Marion Donovan (1917–98) who, in 1946, invented something that changed the lives of mothers: the disposable diaper! A mother of two young children, Marion found herself constantly washing and cleaning soiled clothes, which were the only means to keep her children's bottoms clean. Exhausted and frustrated, she decided to find a solution to her problem. She took down her shower curtain, which was made of waterproof plastic, and sewed it together with the baby's cloth diaper. To her relief, she found that the plastic cover prevented the baby's clothes from being soiled. She named it the 'Boater' because it looked like a boat to her. Marion Donovan had invented the first leak-proof, disposable diaper!
The Hellermans owned Dy Dee Wash, which was a laundry service for washing baby diapers. Dirty cloth diapers were washed and sterilized and
sent back to the owners. At the peak of the business, around 50,000 diapers were washed in a day!
Rise of the inventors
Marion's invention inspired others to explore ways to make mothers' jobs easier. In 1950, Lorraine Hellerman invented a pre-folded diaper. Extra layers of cloth were placed in the middle and the fold was sewn shut to prevent the cloth from moving around. In the same year, Sybil Geeslin invented the first strap-on diaper, which did not require a pin to hold the ends together. It was called the 'Safe-T Di-Dee'. The second half of the twentieth century witnessed a boom in the diaper business as companies like Motherease (1995), Born to Love (1997) and Poochies and HoneyBoy (1999) emerged. These were three of the biggest names whose diapers could be sewn by mothers as a DIY (Do It Yourself) project.
WAHM means Work at Home Moms. It is an online site where mothers working from home can design products for their own use or sell them.
Twenty-first-century diapers
Considering that babies have been around for as long as humans have, it is surprising that the diaper problem was properly addressed only in the twentieth century. However, once the wheels were set in motion, advancement in diaper technology was rapid. In 2000, Fuzzi Bunz opened their first online store, and the Stacinator fleece diaper was introduced. In 2003, WAHM opened the WAHM Boutique and Tuesday Bear, which established itself as one of the biggest diaper companies. In 2002 and 2003, big diaper companies began suing the mushrooming smaller companies, alleging infringement of product designs and other patents held by the larger companies. This led to the closing down
of WAHM diaper makers, who could not keep up with the cutthroat diaper competition.
The return of cloth diapers
In the twenty-first century, Jennifer Labit reintroduced the use of cloth diapers. After losing her job, she could no longer afford to buy disposable diapers and had to start using cloth ones. What began as a necessity grew into a concept, and her company, Cotton Babies, is worth millions today. More and more mothers were opting for cloth diapers because of concerns regarding safety and hygiene. They would buy the cloth diapers in bulk and cut costs by redistributing them among their groups, since they were guaranteed to need diapers until their babies were toilet-trained. As more innovative ideas are conceived to suit the needs of babies and their caregivers, it looks like the diaper business is going to be around for a long time!
Cotton Babies provides day-care, allowing parents to bring their child to the office till they are one year old.
Remember, where there are problems, there are also solutions waiting to be found!
Most inventions are designed to benefit mankind, but there are some whose destructive characteristics outweigh their usefulness. An example is the atomic bomb. It constantly threatens global peace and our existence, as countries that possess it are constantly on the brink of war. Do you think the inventors regret inventing them? Would they have finished their work had they known the consequences? Many inventions benefit mankind, while some, despite the intent of the inventor, are destructive. One such inventor was Alfred Nobel. Do you want to know what he invented? Read on.
Mine your business
In the late nineteenth century, the explosives used for demolition, mining,
construction-blasting and other purposes were made using gunpowder and nitro-glycerine. As it is highly unstable, nitro-glycerine can easily be triggered by temperature, friction and sudden movements, causing it to explode during handling and transportation. This resulted in many accidental explosions, injuries and deaths. One tragic victim was Emil Nobel, brother of the inventor Alfred Nobel. Emil died in a nitro-glycerine explosion at a factory in 1864. A year later, Alfred Nobel purchased land near Krümmel, in Germany. He constructed several buildings to manufacture nitro-glycerine on a large scale. These buildings were separated by large earth embankments, with a laboratory on a small boat anchored in the nearby river Elbe. Nobel set about manufacturing nitro-glycerine until an accident in the laboratory changed his life.
Nitro-glycerine is a compound made from nitric acid and glycerol. It is oily, colourless, highly unstable and volatile. It is more powerful than gunpowder!
Oops! I dropped it again!
On 12 July 1866, Nobel was working in his laboratory when he accidentally dropped a vial of nitro-glycerine on the floor! As an expert manufacturer, he knew the liquid was highly unstable and would explode, causing him severe injuries or even killing him. Everything happened in a matter of seconds, so there was no time to escape. Nobel braced himself for the explosion. Nothing happened.
Dynamite is derived from the Greek word 'dunamis', meaning power. Several seconds passed and yet he didn't feel the pain he knew was coming. He opened his eyes and was astonished at what he saw. The liquid had not exploded! This was a miracle, and he had to find out why. He immediately studied the spilled contents. After careful examination, he found that a pile of sawdust on the floor had absorbed the liquid. The sawdust was mixed with dirt from the embankments, which were made up of diatomaceous earth. He had stumbled upon the knowledge that nitro-glycerine can be stabilized by mixing it with diatomaceous earth! Nobel performed several experiments to confirm his hypothesis, until he was convinced of the stabilizing effect of diatomaceous earth on nitro-glycerine. He immediately filed a patent for his invention in 1867 and called it 'dynamite'. He founded the Nobel Dynamite Trust Company (Dyno Nobel) and licensed his patents to manufacturing companies in several countries in return for interests in those companies.
Diatomaceous earth is formed from the fossils of microscopic single-celled marine algae.
To err is human
Dynamite proved to be a powerful weapon. It could now be easily mass-produced and transported, but it also became more accessible and destructive in the wrong hands. It was used widely in warfare. People were angry at Nobel for inventing such a destructive weapon. When his brother Ludvig Nobel died, a journalist thought it was Alfred and wrote his obituary. It described him as a 'merchant of death', responsible for 'finding ways to kill more people faster than ever before'. Nobel was deeply affected on reading the obituary, and he wanted to leave a better legacy.
The atonement
By the time of his death in 1896, his company had grown to 350 patents and ninety factories in twenty countries. In his will, Nobel left most of his wealth to the establishment of the Nobel Prize, named after him. He expressed his remorse over the destructive power of his invention. Perhaps as atonement for developing a potentially destructive weapon, Nobel felt he must leave the world a better place than he had found it.
Notable Nobel laureates include Malala Yousafzai, Mother Teresa, Rabindranath Tagore, Marie Curie, Barack Obama, etc. (Mahatma Gandhi was nominated several times but never won.) Hence, the Nobel Prize was first awarded in 1901 in Stockholm and continues to be given for achievements in various fields like Physics, Chemistry, Medicine, Literature and Peace. It is considered to be among the highest of civilian honours and aims at fostering the constructive advancement of the human race.
Noble Nobel
Alfred Nobel made his fortune by accidentally inventing dynamite. However, that is not what he is remembered for. The Nobel Prize has become the more enduring of his two legacies. His story represents the pros and cons of any invention: man can either become its victim or its beneficiary and progress through it. As both creator and beneficiary, it is the moral responsibility of each and every individual to use inventions judiciously!
As Nobel did with his Nobel Prize, it is never too late to try and fix problems we create!
14 EnChroma Glasses
If you look around, you will see objects of different colours—green, blue,
red and yellow. How are you able to identify these colours? Did you know that some people cannot differentiate between colours? They are said to suffer from colour blindness.

What is colour blindness?

Colour blindness is an incurable genetic condition caused by an abnormality in the cone cells present in the retina of the eye. These cone cells respond to the different wavelengths of visible light (VIBGYOR—Violet, Indigo, Blue, Green, Yellow, Orange, Red), which is what allows us to distinguish colours. The condition can also be caused by injury to the eye or to the part of the brain where the optic nerve leads. It can be diagnosed with the Ishihara Test and other genetic tests. Colour-blind people have difficulty performing tasks that require them to distinguish colours, such as piloting, driving and serving in the military. There are two main types of colour blindness. In the first type, the person has difficulty distinguishing between red and green. In the second type, blue and yellow are difficult to distinguish.
The Ishihara Test is named after the person who designed it—Shinobu Ishihara.

EnChroma lenses

In 2002, Donald McPherson, a glass scientist, accidentally invented EnChroma lenses. He was working on special eyeglasses for surgeons that would protect their eyes during laser surgery. The glasses he designed contained rare-earth elements, which absorbed certain wavelengths of light and helped surgeons distinguish between blood and tissue during surgery. This was crucial for surgeons. Some of the glasses even went missing from the operating room. Seems like they were a big hit! McPherson was proud of his invention, and he started wearing the glasses as sunglasses.

Eureka!

One day, McPherson went to watch a game with his friends, wearing the glasses he had designed. He lent them to a friend standing nearby, who happened to be colour-blind. On wearing McPherson's glasses, the friend was amazed to see the green of the grass and the orange cones lining the field! McPherson had accidentally invented glasses that could help colour-blind people see colours! He decided to develop his invention until it was perfectly suited to people with the condition. He started a company named EnChroma, along with Tony Dykes and Andrew Schmeder. They inserted EnChroma lenses into trendy frames and sold them as sunglasses. The glasses were comfortable enough to be worn daily by people with normal vision, and colour-blind people could wear them while appearing to be wearing regular sunglasses.
Colour blindness is more common in males than in females.

A paint company used EnChroma glasses in an advertising campaign. Colour-blind people were given the special glasses, and videos captured the moment they could see colours for the very first time. The glasses are not meant to cure the condition but to provide assistance. When asked about his invention, McPherson said, 'It still gives me goosebumps when someone bends down, and sees a flower and asks, "Is that lavender?"'

Origin of sunglasses

The earliest sunglasses were invented around the twelfth century and were made of slabs of quartz (a mineral composed of silicon and oxygen) to block the light from the sun. However, they were not used as sunglasses then. Who do you think invented them? Yep! The Chinese! They were popular among Chinese judges, who wore them to conceal their eyes. Human emotions are easily read in the eyes, and this way the judges could appear unbiased when they dealt with people! In the fifteenth century, eyeglasses believed to correct vision were introduced to Europe from China through the Italians. Sam Foster invented modern sunglasses in 1929, and the reputed Ray-Ban launched its anti-glare Aviator glasses in 1937, which remain popular to this day. So, the next time you go shopping for a pair, do not be scared to choose something to your taste. As for the trend, whichever pair you buy is sure to make a comeback sooner or later.
Polaroid filters were invented by Edwin H. Land in 1936. These filters protect the eyes from harmful rays such as UV rays, so sunglasses were no longer just about fashion but about health as well.
If you know anyone who is colour-blind, you can now tell them the good news. They will be able to see colours using EnChroma glasses!
Gunpowder is an explosive made up of three main components: charcoal
(mostly pure carbon), sulphur and potassium nitrate (saltpetre). The mixture is explosive because it burns very rapidly, producing a large volume of hot gas that expands so quickly it explodes with a release of energy and sound.

The quest for eternity

There is no definite record of who invented gunpowder, but historians widely believe that alchemists in China, in their quest for eternal life, invented it accidentally. The historical documents discovered mention a form of gunpowder: there is a record of an alchemist named Wei Boyang describing the effect of a mixture of three powders, believed to be the three main components of gunpowder. Emperor Wu Di of the Han Dynasty is said to have funded research by Taoist alchemists seeking the elixir of life. Gunpowder's initial use was very
different from what it came to be used for. The Chinese name for gunpowder is 'huoyao', which means 'fire medicine'. There is no record of it being used as a weapon until much later. Around 700 AD, it was used in fireworks by the Tang dynasty, and it was only after 900 AD that it came to be used as a weapon. As with their other inventions, the Chinese kept the ingredients of gunpowder a secret for centuries. By the twelfth century, it had spread to Eastern Europe. In 1774, Louis XVI ascended the throne of France. On realizing that France depended on Great Britain for the key ingredient, saltpetre, he appointed Antoine Lavoisier as the head of the Gunpowder Administration. Within a year, France had a surplus to export, including to America, where the American Revolution was brewing. In India, gunpowder is believed to have been introduced by the Mongols during their invasions.
Alchemy: the process of turning base metals into gold
Elixir: a potion believed to prolong life

Remember, remember, the 5th of November!

In Great Britain, Guy Fawkes made history in the famous Gunpowder Plot. Fawkes was captured on 5 November 1605, hiding under the House of Lords with barrels of gunpowder. This incident is referenced in the 2005 Hollywood movie V for Vendetta.

The gunpowder irony

Technological advancement led to more innovative and safer ways of producing and preserving gunpowder. Gunpowder was also known as 'black powder' in the nineteenth-century United States, to distinguish it from the newer powders that had become available in the market.
Depending on the proportions of the mixture, gunpowder came in different varieties, such as brown gunpowder (also known as cocoa powder).
The earliest account of gunpowder in Europe was by Roger Bacon in his Opus Majus, written in 1267.
Isn’t it ironic that gunpowder, with all its potential for causing death, was accidentally invented in an attempt to prolong life?
16 Hypodermic Syringe
A hypodermic syringe is one of the most mass-produced medical devices.
Hundreds are used in hospitals in a single day, and they have saved more lives than perhaps any other medical device. The earliest experiment with intravenous injection was done by Christopher Wren in 1656. He used a goose quill and an animal bladder to administer opium to dogs, and the experiment was more or less successful. The syringe with a hollow metal needle was invented almost two centuries later by Francis Rynd in 1844.

The man of the hour
Francis Rynd (1801–61) was a surgeon at Dublin's Meath Hospital. He had been treating a patient suffering from neuralgia—a painful condition caused by damaged nerves. The patient was in so much pain that she even tried drinking morphine, but taken orally, it did not ease her pain. At this point, Rynd came up with an innovative solution: he decided to inject the morphine directly under her skin. On 3 June 1844, Rynd designed the first hypodermic syringe and used it to inject the patient. The morphine, delivered straight into her system, temporarily numbed her pain. The devices that could strictly be termed hypodermic syringes were invented independently, at almost the same time, by Alexander Wood and Charles Gabriel Pravaz in 1853. In 1897, the pharmaceutical firm BD, formed by Maxwell W. Becton and Fairleigh S. Dickinson, acquired a half-interest in a patent for an all-glass syringe owned by H. Wulfing Luer and imported the syringes from Paris; later, they began manufacturing them in the United States. With the discovery of insulin in 1921, which had to be injected, there was renewed interest in the production of syringes. In 1925, BD launched the Yale-Luer-Lok syringe, designed by Dickinson, which could safely attach and detach a needle from the syringe; its connectors remain the standard to this day. Joining the stalwarts of male inventors is Letitia Mumford Geer, who patented a one-handed syringe in 1899. Unlike other syringes, which required two hands to administer, Letitia's syringe could be operated with only one.

Use and throw

Arthur E. Smith patented the first disposable glass syringes in 1949. Glass syringes were difficult to handle, fragile and prone to becoming brittle with continuous use. At the same time, producing disposable glass syringes was expensive.
In 1955, Roehr Products developed the first plastic disposable syringe. However, the medical industry was not keen on disposing of perfectly usable glass syringes, which could be sterilized and reused. In 1956, a New Zealander, Colin Murdoch, designed the plastic syringe whose design is in use to this day. The world of hypodermic syringes came to be dominated by BD. In 1961, they introduced the 'Plastipak', which brought disposable plastic syringes to the mainstream.
Needleless

Nowadays, micro-needles make injections nearly painless. One such device was invented by Mark Prausnitz and Mark Allen using 400 silicon-based microscopic needles, each about the width of a human hair. These needles are so small that they cause very little pain during injection. Another option is the 'hypospray', which sprays powdered medicine onto the skin without needles at all. Hypodermic needles are used by dentists, nurses and physicians, and in some cases patients perform the injections themselves. Diabetic patients, for instance, have to remain alert and monitor their sugar levels constantly. Doctors and nurses often use soothing techniques to reduce the stress of patients who are scared of needles. Several precautions must be taken when handling a hypodermic syringe: a single prick could be fatal when dealing with highly communicable diseases.
Hypodermic needles are made by a process called tube-drawing, where a stainless steel tube is repeatedly drawn using a specially-designed die or drawing tool until the desired size of the needle is obtained.
The fear of needles is known as trypanophobia!
17 Ice Cream Cone
How would summer vacations be if there weren’t yummy ice creams to lick
and munch on? No vacation at all! The challenge of eating ice cream is a race against time and temperature—to slurp it all up before it melts in your hands. When you are done with the scoop (or scoops), you eat the wafer cone, with some cream lodged inside. A real summer treat!

Cornucopia

The term 'cornucopia' comes from the Latin words 'cornu copiae', which mean 'horn of plenty'. The shape of the horn inspired the conical shape of the ice cream cone. There were reports from travellers to Düsseldorf,
Germany, who claimed to have eaten ice cream in edible cones. In his cookbook The Modern Cook, published in 1846, Charles Elmé Francatelli mentions cornets filled with ice cream. According to the historian Anne Funderburg, as with most popular inventions, there are several origin stories and several people claiming the invention, including Nick and Albert Kabbaz of Syria, Frank and Charles Menches of Ohio and Abe Doumar of Lebanon. It is impossible to know for sure who the true inventor is, so we shall discuss only the two most credible stories.

Story No. 1

Ernest Hamwi is believed to have attended the 1904 Louisiana Purchase Exposition, or World's Fair, where he and his wife opened a 'zalabia' booth. Zalabia is a traditional Lebanese waffle baked between two hot iron plates. The Hamwis' booth was stationed next to many ice cream booths. The vendor next to them was selling his ice cream so fast that he ran out of clean cups to serve it in. In those days, ice cream was served in a bowl or a glass cup. According to Hamwi's letter to the Ice Cream Trade Journal in 1928, he saw an opportunity. He quickly rolled a zalabia into the shape of a cone and offered it to the ice cream vendor. The idea spread rapidly, and people began adopting the edible cone as a holder for their ice cream.

Story No. 2

This one involves zalabia, too! Abe Doumar migrated to the United States from Damascus in 1895. At the 1904 World's Fair, he sold souvenirs during the day and joined others selling zalabia at night. On one such night, he took a zalabia and rolled it the way flatbread is rolled in Syria. Instead of stuffing the cone with meat, he filled it with ice cream and, for the first time, was able to eat ice cream without a spoon. According to his nephew Albert Doumar, Abe shared his new idea with the other zalabia vendors there.
Abe believed that Ernest Hamwi must have been one of the many zalabia vendors who used his idea, as Hamwi was also present at the 1904 Exposition. When the Exposition ended, Doumar developed a machine for making zalabia, which he then used to roll out
cones. In 1905, he opened an ice cream stand at Coney Island. As his business prospered, he brought over the other members of his family, including his parents and brothers, from Syria. They settled in Norfolk and established Doumar's Cones and Barbecue, which runs to this day. Despite the many origin stories we have just read, we can never know for certain who invented the ice cream cone. However, it is safe to say that it was invented through ingenuity. An ice cream cone may seem unimportant compared with other inventions, but we cannot imagine eating our ice cream any other way, can we?
Alfred L. Cralle invented the ice cream scoop, which he called the 'Ice Cream Mold and Disher'. It serves as a model for the ice cream scoops used today.
The heart is one of the most vital organs in the human body. It pumps
blood, which circulates through blood vessels, carrying oxygen to different parts of the body. The importance of the heart in the circulation of blood was scientifically explained by William Harvey in the seventeenth century. The sinoatrial node is a cluster of specialized cells in the wall of the right atrium. It sends out electrical impulses, and these impulses regulate the systolic and diastolic motions of the heart. Normal blood pressure is 120/80 mm Hg (millimetres of mercury), where 120 is the systolic pressure and 80 is the diastolic pressure.
Systole is when the heart contracts to pump blood to the other parts of the body through the arteries. Diastole is when the heart relaxes after systole, and the veins carry blood into the heart. In certain cases, however, the heart does not receive enough impulses to regulate the heartbeat, for reasons such as blockages in the conduction pathways. This results in arrhythmia, or irregular heartbeats, which can lead to cardiac arrest. An artificial device that can regulate the electrical impulses is therefore needed to stabilize the heart and prevent damage or death.

Story of its origin

In 1958, the first wearable pacemaker was developed by Earl Bakken (1924–2018). However, it was Wilson Greatbatch (1919–2011) who invented the first prototype of the implantable pacemaker. Born in Buffalo, New York, Greatbatch was naturally curious. He taught himself to play the harmonica, his favourite musical instrument. The 1920s was a time when radio was making waves, and Greatbatch was fascinated by it; he built his own short-wave radio equipment. His knowledge and knack for improvisation became a great asset during the Second World War. After the war, he enrolled at Cornell University and went on to join the Psychology Department's animal behaviour farm. It was here, through a chance encounter with visiting doctors from New England, that he first learned of the complications caused by irregular heartbeats, which could even result in death.
The human heart has four chambers—the right atrium, the left atrium, the right ventricle and the left ventricle.
Fixing a broken heart!

By 1956, Greatbatch was teaching electrical engineering at Buffalo University while working at the Chronic Disease Research Institute, designing circuits to record heart sounds. One day, while working in the laboratory, he accidentally grabbed the wrong resistor and plugged it into the circuit he was working on. The circuit began to pulse with a steady rhythm, much like a heartbeat! He immediately realized that he had stumbled upon an invention that could solve the problem of the irregular heart. He contacted William C. Chardack at Buffalo's Veterans Administration Hospital and convinced him to perform an experiment using his device. Chardack agreed, and on 7 May 1958, Greatbatch brought his device over.
The pacemaker was the first-ever electronic device to be implanted in the human body. Åke Senning of Sweden conducted the first successful human implant in 1958.

Chardack and surgeon Andrew Gage tested the device by attaching its two wires to the heart of a dog. To their astonishment, the device started regulating the pace of the animal's heartbeat! Although the device worked for a mere three hours, it was proof that the idea worked. The experiment was not without its setbacks, though: they realized that body fluids would seep into the device and cause short circuits, so they had to come up with a device that could withstand being implanted inside a living body. In the 1950s, there was also competition from other groups trying to solve the same problem, and Greatbatch was under immense pressure to upgrade his device. He gave up all his other jobs, set up a workshop in his barn and worked on the device constantly. He managed to build around fifty
pacemakers. William C. Chardack and his team had implanted pacemakers in ten patients by 1960, with moderate success; one patient even lived for thirty years after the implant. The pacemaker not only ensured longevity but, because it was implanted in the body, also made it possible for people with heart complications to carry out daily activities. Another challenge was the short lifespan of the battery, which had to be changed every two years, which proved painful and expensive for patients. Greatbatch solved that problem, too, by inventing a special lithium battery that lasted more than ten years.

The modest hero

Wilson Greatbatch humbly remarked, 'If I didn't do it, someone else would have', and put the money that came from his invention into funding further research to improve human health and advance learning in every field, including his favourite, music. Wilson Greatbatch, the man responsible for the healthy fluttering beats of millions of hearts, felt the last flutter of his own on 27 September 2011.
Try to remain as calm as possible, and then rest your palm on the left side of your chest. Count the number of times your heart beats in a minute. This is called bpm, or beats per minute. A normal resting heart rate is between 60 bpm and 100 bpm.
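If you enjoy tinkering, the simple arithmetic behind this activity can be sketched in a few lines of Python. This is just an illustration; the function names here are made up for the example, not part of the activity.

```python
def beats_per_minute(beats_counted, seconds):
    """Convert a beat count over any interval into beats per minute."""
    return beats_counted * 60 / seconds

def is_normal_resting_rate(bpm):
    """A normal resting heart rate is between 60 and 100 bpm."""
    return 60 <= bpm <= 100

# Counting 18 beats in 15 seconds works out to 72 bpm, a normal resting rate.
rate = beats_per_minute(18, 15)
print(rate)                         # 72.0
print(is_normal_resting_rate(rate)) # True
```

Counting beats over a shorter interval, such as fifteen seconds, and scaling up is a common shortcut, though counting a full minute is more accurate.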
While watching action movies, do you ever wonder how the bullets do not
penetrate the bulletproof vests the characters wear? It is not just movie magic: real bulletproof vests do exist. And the material they are made of was invented by a woman! Kevlar is a strong, heat-resistant synthetic fibre. It was developed by a scientist named Stephanie Kwolek (1923–2014) in 1965 while she was working at Du Pont.

The woman behind the legend

Stephanie Kwolek worked as a chemist at Du Pont for forty years. She is best known for developing Kevlar, a high-performing family of synthetic polymers. It all began in 1964, when Kwolek and her group were searching for a lighter, stronger fibre for making tyres. She
had been working on a polymer that formed a liquid crystal. A liquid crystal is a hybrid state of matter in which a substance has properties of both a liquid and a crystalline solid. In the laboratory, this solution was usually thrown away. However, Kwolek was curious about its potential. She requested the technician, Charles Smullen, to test the solution she had collected. The result was wonderful! She was amazed to find that the solution produced a fibre much stronger than nylon. She took her discovery to her supervisor, who immediately recognized the significance of the new polymer she presented to him. Thanks to her findings, the fibre and its properties were thoroughly explored, and by 1971, Du Pont had developed Kevlar.
The nerdy name for the material is poly-paraphenylene terephthalamide, but it is simply referred to as Kevlar.

The master of all trades

The production of Kevlar is an expensive process. Moreover, the ultraviolet (UV) rays of the sun can degrade Kevlar fibre, so it is not suitable for use in direct sunlight. However, it is indispensable for certain uses. In cryogenics, Kevlar is used because it has low thermal conductivity. Another very important use of Kevlar is in the production of armour: ballistic (bulletproof) vests, ballistic face masks and combat helmets are made of Kevlar because of its high heat resistance, durability and lightness. For personal use, Kevlar is also a good material for jackets, gloves and safety clothing for bikers. The famous shoe manufacturer Nike launched its Basketball Elite II Series 2.0 using Kevlar in the front area and the shoelaces, and several bicycle companies manufacture tyres made of Kevlar. It also has useful acoustic properties and is therefore used to make loudspeaker cones. Kevlar
is also used as a substitute for Teflon in non-stick pans. Kevlar ropes and fibres are sturdy and are used in suspension bridges and in the protective sheaths of optic fibre cables. It is also used in aviation as an alternative to carbon fibre or fibreglass. Kwolek was awarded the Lavoisier Medal by the Du Pont company for her contribution as a 'persistent experimentalist and role model whose discovery of liquid crystalline polyamides led to Kevlar aramid fibres'. In 1995, she was inducted into the National Inventors Hall of Fame, and she received the Perkin Medal in 1997. The Royal Society of Chemistry grants the 'Stephanie L. Kwolek Award' biennially 'to recognize exceptional contributions to the area of material chemistry from a scientist working outside UK'. The inventor of Kevlar passed away at the age of ninety on 18 June 2014.
Kwolek was head of the polymer research team at Du Pont until she retired in 1986.
Is a vest made of Kevlar really bulletproof? Well, not entirely. It depends on the speed at which the bullet is travelling, as well as the distance. If the tip of the bullet is sharp and hard, it can still go through the vest!
20 Liquid Paper
What do you do when you make a mistake while writing? Do you cross it
out, or do you use a correcting pen to cover the mistake? The latter uses what is popularly known as 'liquid paper', a name used for correction fluid, correction tape and correcting pens. It is invaluable for correcting mistakes made while typing or writing. It was first invented by Bette Clair McMurray in 1956.

Blot-out

Bette (1924–80) worked as a secretary at the Texas Bank & Trust, typing away at documents and letters on her electric typewriter. In the 1950s, electric
typewriters made typing easier and faster. However, as most things have disadvantages along with their advantages, so did the typewriter. It had a ribbon attached for the ink, which made it difficult to correct typing mistakes. As the saying goes, 'Necessity is the mother of invention', and Bette desperately needed something to correct her typing mistakes. She had been interested in painting since early childhood, a passion instilled in her by her mother, Christine Duval. One fateful day, she had a wild idea. She concocted a mixture of tempera paint, tinted to match the stationery used at the bank, and secretly began applying a thin coat of it over her typos. The result was so effective that her superiors had no idea she had blotted out her mistakes when she submitted her typed pages.

Mistake out, success in

Bette named her product 'Mistake Out', as it erased typing mistakes. Before long, her paint mixture became an office secret, and all the other secretaries started borrowing her 'Mistake Out'. As demand increased, she realized that constantly sharing her paint bottle was hampering her work, so she began selling small bottles of it to other secretaries around the office. By 1956, it occurred to her that the product had potential, and she founded the Mistake Out Company in her kitchen, manufacturing and selling from home. She was so driven in her efforts to make her company successful that the bank grew unhappy with her attention being divided between her company and her job as a secretary. On one occasion, she accidentally signed a letter she was typing at work with the notation 'The Mistake Out Company'. She was fired for this negligence in 1958. However, she went on to become a successful entrepreneur!
In 2000, Newell Rubbermaid (now known as Newell Brands) acquired the product and the brand name of Liquid Paper. It is currently endorsed by
Papermate.

Enabling others

She changed the initial name 'Mistake Out' to 'Liquid Paper' and received a patent for her invention. She worked from her kitchen, employing her young son Michael Nesmith and his friend, paying them a dollar an hour to fill small bottles with the product. She then took these to wholesalers, and finally her hard work paid off. Her product started being promoted in magazines, and such large orders started coming in that she needed to expand. In the next decade, she established the headquarters of her business in Dallas, Texas, and went on to make more Liquid Paper and more money. Ever the caring person, she did not hesitate to share her invention with her friends. She structured her company in such a way that all the employees were consulted on company decisions. Part of the money earned from Liquid Paper went to two charities: the Gihon Foundation, which gave financial support and grants to women in the arts, and the Bette Clair McMurray Foundation, which gave grants and financial help to women in business. Unfortunately, in 1975, her second marriage, to Robert Graham, ended in a bitter divorce. Graham tried to take the company away from her. However, she was able to take her company back despite her failing health, and in 1979, Bette sold her company to the Gillette Corporation for a hefty sum of $47.5 million! Bette Clair McMurray, inventor, businesswoman, philanthropist and the woman responsible for covering up many a mistake, passed away on 12 May 1980.
Bette set up a library at the Liquid Paper Headquarters, along with a childcare facility for employees to bring their kids to work till the child is one year old!
Find out more about how Bette McMurray helped others succeed.
Matchsticks are used for starting fire. They come in small wooden sticks
stacked neatly in a box. One end of the stick is coated with a combustible substance (one that burns easily). This end can be lit by striking it against a surface. There are two broad types of matchsticks. The first is called 'safety matches' and the other 'strike-anywhere matches'. Safety matches can only be lit by striking them against a specially designed surface, while strike-anywhere matches can be struck against almost any surface.

Strike!

Jean Chancel of Paris invented the first self-igniting match in 1805. It consisted of a head made of potassium chlorate, sulphur, sugar and rubber. The matchsticks were lit by dipping the combustible head into a bottle of sulphuric acid. This model was very risky to light, and it was expensive.
The accidental match

John Walker (1781–1859), a chemist from Durham, England, was the first person to successfully make a friction match, in 1826. Walker was preparing a lighting mixture when a matchstick dipped in it suddenly ignited on contact with the hearth (the floor of a furnace). Curious to know why, Walker began repeating the process and made matchsticks with heads dipped in antimony sulphide, chlorate of potash and gum to hold the mixture together. He also added camphor to reduce the smell of the burning chemicals. Each box of matchsticks came with a piece of sandpaper for igniting the sticks through friction. The matchsticks invented by Walker were named 'Congreves' after Sir William Congreve, an inventor and rocket pioneer. Walker did not patent his invention, so it was possible for people to replicate his design and improve upon it.

The devil's matches

In 1829, Sir Isaac Holden created an improved version of Walker's matchsticks. Samuel Jones, a London chemist, learned of the device through his son, who happened to be Holden's pupil. Jones patented the matchsticks and named them 'Lucifer matches'—Lucifer being another name for the Biblical Satan. In 1830, Charles Sauria replaced the antimony sulphide used in earlier versions with white phosphorus. Matches of this kind were sold as 'Loco Focos' in the United States.

The match ban

Early chemical matches were not commercially successful because they were too expensive for regular use, and there were safety hazards. One component of the flammable tip was white phosphorus, which caused phossy jaw, or 'phosphorus necrosis of the jaw': prolonged exposure to white phosphorus destroys the bones of the jaw. This led to its ban in several countries. The United Kingdom passed a law in 1908 banning the use of white phosphorus in matches from 1 January 1910. India and Japan declared the ban in 1919, followed by China in 1925. So, how were matches made then?
Arthur Albright (1811–1900) solved the problem by using red
phosphorus, which is non-toxic. Red phosphorus is prepared by heating white phosphorus to a certain temperature. In 1844, Gustaf Erik Pasch (1788–1862) filed a patent for matches struck against a separate surface coated with red phosphorus. Meanwhile, Johan Edvard Lundström and his brother Carl Frans Lundström established a match industry in Sweden after obtaining a sample of red phosphorus from Albright. They began producing red-phosphorus safety matches in 1858. The Lundström brothers held a monopoly on matchstick production for most of the nineteenth century.
Phillumeny is the hobby of collecting matchboxes and match-related items.

Fire on the go

The invention of matchsticks made it possible for us to carry our own portable source of fire, and with the introduction of lighters, portable fire became even more convenient. Yet the popularity of matches remains intact. The match is now a common household item, modified to suit specific needs; storm matches, for instance, are specially made to burn brighter and longer for use at sea. Phillumenists collect different boxes of matches as collectors' items. The name literally means 'loving light'. Maybe, the next time you come across a box of matches, you can start your own collection! But be careful to take the matchsticks out of the box, as they are still pretty dangerous to play with.
In 1888, there was a Matchgirls’ Strike. Try and find out more about the incident.
22 Microwave Ovens
Microwave ovens are a necessary appliance for the modern household. They cut down the amount of time spent on food preparation and are a blessing for the modern-day kitchen. So, how did this wonderful invention come into being? The story is an interesting one because of the accidental way in which it happened, thanks to the curiosity of a hard-working man.
Beating the odds
Percy L. Spencer (1894–1970) was born on 19 July 1894 in Howland, Maine, USA. His childhood was a difficult one. His father passed away when he was very young, and his mother left him in the care of his aunt and uncle. After his uncle’s death, financial constraints compelled him to drop out of school at the age of twelve. Young Percy worked at a spool mill to support himself and his aunt. Although he had no formal education, Spencer was intelligent and determined to succeed in life.
The electric age
By the early twentieth century, some factories in the United States had begun using electricity to operate their machinery. When Spencer came to know that the local paper mill was about to be electrified too, he immediately applied for a job and started teaching himself everything he could about electricity. In 1910, materials on electricity were very rare, and he had to rely on trial and error. He was selected despite having no degree in electrical engineering, or any degree at all. He worked at the paper mill until he was eighteen, when a historic event changed the course of his life. In 1912, the sinking of the RMS Titanic shocked the world. Spencer drew inspiration from stories of bravery exhibited by the crew aboard, particularly that of the Radio Boys. He enlisted in the US Navy and enrolled in their radio school to learn about wireless telegraphy and wireless communication. His time in the Navy established him as a valuable asset.
The RMS Titanic was a ship which sank on its maiden voyage from Southampton, England, to New York on the night of 14 April 1912. More than 1,500 lives were lost.
Perseverance pays
After being discharged from the Navy, Percy L. Spencer briefly worked for the Wireless Specialty Apparatus Co. in Boston. There, the curious young man often stayed up all night figuring out how things worked. By the 1930s, through hard work and determination, the grammar school dropout from Maine had become a leading expert in radio tube design, working for Raytheon.
Radar is a system used for detecting the presence, position, speed, etc. of an object by emitting radio waves, which are reflected back from the object to the source. Raytheon was an independent defence company working as a contractor for the US Department of Defence. It was Spencer who played a key role in obtaining a government project to produce combat radar equipment at the Massachusetts Institute of Technology’s (MIT) laboratory. At that time, vacuum tubes called magnetrons were used to produce the microwave signals needed for radar sets to function. They were vital for warfare and played a great part in making the RAF (Royal Air Force) more effective. However, producing the equipment was a slow and laborious task, and an average of only 17 magnetrons could be made in a day. Spencer ingeniously improved the manufacturing process and was able to increase production to 2,600 per day. Where others saw a problem, Spencer saw an opportunity; he tinkered away in his laboratory to learn why things worked the way they did and how they could be improved. This was the quality that would lead him to his most famous invention: the microwave oven.
The itch to know
While producing magnetrons in the laboratory, Spencer noticed that the candy bar in his pocket had melted. This was unusual, because candy bars were not supposed to melt at room temperature unless exposed to heat or sunlight. It piqued his curiosity, and he became intent on finding the reason behind this strange phenomenon. He wanted to find out more, and so he sent an assistant to bring popcorn kernels.
Lightning is a natural form of static electricity. To find out more about electricity, look up Benjamin Franklin and his famous kite experiment.
When he placed the kernels near the tube, moments later they began to crack and pop, becoming the first microwaved popcorn! Spencer knew he was on the verge of discovering something and conducted more experiments. An egg was put inside a kettle, and the kettle was placed near the tube. Joined by his curious colleagues, they all watched as the kettle began to rattle from the internal pressure; when one curious colleague looked inside, the egg exploded and splattered cooked yolk and albumin onto his startled face. The success with the popcorn kernels and the egg led to a series of experiments with other foods. Spencer also came up with a new model in which microwave energy was fed into a confined metal box. Engineers at Raytheon began working on the prototype, and on 8 October 1945, the company filed a patent proposing the use of microwaves for cooking. However, these units were not as portable and affordable as they are now. They were bulky and expensive: nearly 6 feet tall and approximately 5,000 USD apiece, an enormous sum at the time. Owing to their size and cost, these ovens were mainly installed in restaurants. Still, it was only a matter of time before their efficiency would make them an essential kitchen appliance. After several tests, in 1947, the first commercial microwave oven, called the ‘Radarange’, was introduced in the market with a price range of 2,000–3,000 USD. Although the initial response to the appliance was not positive, further improvements were made, and it rapidly gained popularity. 1967 saw the introduction of the first countertop oven.
In the 1970s, microwave ovens became so popular that their sales exceeded those of gas ranges. This was due to the product becoming better and cheaper, as well as growing acceptance of its benefits among consumers.
Curiosity has no expiry date
In 1958, Don Murray, in his Reader’s Digest article ‘Percy Spencer and His Itch to Know’, described his meeting with the ever-curious inventor. Spencer had made Murray take his shoes off so that he could examine exactly how they were made. This was the kind of curiosity that made him chance upon an invention that was to help mankind in its culinary endeavours. The orphan from Maine rose to the position of senior vice president at Raytheon and continued serving as a senior consultant until his death in 1970 at the age of seventy-six. Despite his humble beginnings, Percy L. Spencer was able to forge his path through sheer grit. During his lifetime, he held more than 100 patents, was awarded an honorary Doctor of Science degree from the University of Massachusetts and the Distinguished Public Service Award from the US Navy, and became a Fellow of the American Academy of Arts and Sciences and a member of the Institute of Radio Engineers. In 1999, Dr Percy LeBaron Spencer was posthumously inducted into the National Inventors Hall of Fame.
This is one beautiful ‘rags to glory’ story, isn’t it? Can you think of more such stories?
23 Modern High Heels
Are you familiar with the story of Cinderella and the glass slippers? What if
Cinderella had not left them behind? The prince would have had a hard time finding her! That is the fairy-tale version. Now, let’s find out more about this invention for which girls choose style over comfort!
Ancient heels?
In the Ramalingeswara Temple in the state of Telangana, India, among the many statues is one in which a woman’s foot seems to be covered by a raised shoe. Could the woman be wearing an early form of high heels? We may never know for certain, but high heels have been around for a very long time. In the Persian Empire, shoes with elevated heels were worn to help a rider keep his balance so that he could shoot an arrow while mounted on a horse! High heels were not the symbol of poise and beauty they are associated with today; they were an important part of a warrior’s costume, although modern pointed stilettos look just as lethal. It might come as a surprise that in the olden days high heels were not worn exclusively by women, as they are now. Can you imagine grown men strutting around in high heels?
Probably not, but times were very different then, and the beloved high heels have come a long way since.
For the love of heels
You can’t help but look on in wonder when you see fashion models walking the runway in high heels. Are they comfortable? Most likely not. High heels restrict speed, safety and comfort, but they sure look beautiful! It is fair to admit that a pair of high heels is not the most practical invention, even though they are beautiful and stylish. But then, fashion is not so much about making sense, is it? So, why do women wear high heels?
Heel story
In 1599, when anything Persian was seen as new and exotic in Europe, Shah Abbas, the king of Persia, sent his emissaries to Spain, Russia and Germany. These emissaries are believed to have introduced shoes with slightly elevated heels to the European courts. The aristocratic class was all too eager to try something other than their native fashion styles. As these shoes were worn by the aristocracy, they also became a symbol of wealth and status. It was only a matter of time before the lower ranks of society started imitating the style.
In 1649, the British decided they didn’t want their king anymore and beheaded him to set up a Commonwealth. They then realized the Commonwealth wasn’t very good either and wanted their king back. But they had already killed the king, so they brought back his son, who had been hiding in France, to be made king. This event, and the period after the return of Charles II to the throne of England, is known as the Restoration.
How did the aristocratic class respond to this ‘copycat’ business? They dramatically increased the height of their heels! The higher the heels, the greater the social standing of the wearer. So, it was pride that led to the accidental invention of high heels. Not to forget that these were worn in the muddy streets of the seventeenth century! As mentioned earlier, heels were worn by both men and women. For instance, high heels were a blessing for Louis XIV of France, who was only 5 feet 4 inches tall! The king was believed to command more respect if he was able to look down on the people he addressed instead of looking up at them. When anyone and everyone who could afford high heels began wearing them, Louis XIV did not like it. He passed a law that only members of the court were allowed to wear red-coloured heels. The colour of the heels became a mark of distinction once more. But what happens in France does not stay in France; it becomes a trend. Of course, England picked up on the heels too! In 1661, shortly after the famous Restoration, the six-foot Charles II wore heels to his coronation.
Why did men stop wearing heels?
The Enlightenment of the eighteenth century marked the end of high heels for men. It brought dramatic changes in fashion for both men and women. The extravagant and colourful were discarded in favour of more solemn colours. High heels became associated with what were then considered feminine qualities of frailty, foolishness and ignorance. Men naturally gave up wearing high heels; they didn’t want to be associated with anything feminine. High heels instead became associated with feminine desirability and beauty.
The Enlightenment was a period of philosophical reflection and investigation that laid the foundation for modern philosophy and science.
Even in the twenty-first century, women still wear heels as a style statement. A pair of Jimmy Choo designs or the red soles of a Christian Louboutin pair are never out of fashion. In 2019, actress Kristen Stewart of Twilight fame made a bold move on the red carpet of the Cannes Film Festival, which has a strict ‘heels only’ policy: she removed the pair of Christian Louboutins she was wearing and walked barefoot! Modern high heels are associated with women’s beauty on the one hand and with the objectification of women’s bodies on the other. So what should the modern woman do? To wear or not to wear? The answer is: wear what makes you happy. The choice is yours.
Imelda Marcos, First Lady of the Philippines (1965–86), was said to own more than 3,000 pairs of shoes!
In 2018, the Passion Diamond Shoes were launched by the brand Jada Dubai. Made from real gold and diamonds, they were the most expensive pair of shoes in the world, worth $17 million! That’s equivalent to about ₹120 crore!
24 Monopoly
What does the word ‘monopoly’ mean? It means a market situation where one producer controls the supply of a good or service. Monopoly is a board game where players trade properties and try to accumulate wealth. It is a game which tests the players’ skill in business. The name ‘monopoly’ is apt, as players try to gain control of the market for their own gain and make their opponents bankrupt. The game board consists of twenty-eight properties (twenty-two streets, four railroads and two utilities), three Chance spaces, three Community Chest spaces, a Luxury Tax space, an Income Tax space and four corner squares: ‘GO’, ‘Jail/Just Visiting’, ‘Free Parking’ and ‘Go to Jail’. Many changes and additions have been made to the game over the years, and different versions have been adapted to suit the varied tastes of Monopoly players.
Accidental genius or monopolizer?
Charles Darrow (1889–1967) is credited with the invention of Monopoly. He was the first game designer to become a millionaire. He was said to have come up with his invention after losing his job during the Great Depression, which began in 1929. The story of Darrow’s invention describes how he would often see neighbours playing homemade board games with their families. This gave him the idea of designing a board game which would be not only fun but also instructive. With help from his wife and son, Darrow created what we now know as Monopoly. The game was a runaway success and was played in thousands of American households. Darrow sold the game to Parker Brothers; it made Parker Brothers rich and turned Charles Darrow into a millionaire.
Monopoly spy
In 1936, Monopoly was licensed to be sold outside the US. In 1941, during the Second World War, the licensed manufacturer of the game in Britain, John Waddington Ltd., was asked by the Secret Intelligence Service to make a special edition with real money, maps and other items hidden inside to help prisoners of war escape.
The British Secret Intelligence Service, also known as MI6, was made famous by the fictional character James Bond, who works for the Service in the James Bond movie franchise.
Magie’s game
Charles’s idea for the game was not entirely his own. He got it from an older game invented by Elizabeth J. Magie (1866–1948), called
‘The Landlord’s Game’. Magie designed the game based on the theories of the economist Henry George. She was granted a patent for it on 5 January 1904 and another for a revised version of the game in 1924. She invented the game to show the adverse effects that total control over land has on the economic stability of a country. She disliked the actions of rich industrialists such as Andrew Carnegie and John D. Rockefeller, who grabbed land simply because they could afford to. She knew that an open attack against such financial titans would be ineffective.
Magie was inspired by the economist Henry George, who proposed taxing wealthy landowners so that the wealth generated from land would benefit everyone, including the landless poor.
Therefore, she created two sets of rules for her game. The first set used anti-monopolist rules, where everyone was rewarded when no one tried to gain control over the others. The second set is the one we now associate with Monopoly, where each player tries to grab the property of the others. Magie’s game was meant as an instructive tool to show that the anti-monopolist way was the moral and just one. She planned to win over the minds of the public by showing them that the anti-monopolist mode of playing would benefit everyone.
Game of life
Elizabeth Magie was many things: a newspaper publisher, a writer, an abolitionist, a feminist, a stenographer and a game inventor, often described as a bold and progressive woman. In 1902, Magie said of the game, ‘It is a practical demonstration of the present system of land-grabbing with all its outcomes and consequences. It might as well have been called the Game of Life, as it contains all the elements of success and failure in the real world,
and the object is the same as the human race in general seem to have, i.e., the accumulation of wealth.’
Whose game is it anyway?
Magie’s ‘The Landlord’s Game’ was popular among a select few, who modified the property names to match their own neighbourhoods. It is highly likely that the game which inspired Darrow was one such version of Magie’s game. However, the credit for the innovative game went to Charles Darrow. That was until Ralph Anspach came into the picture. In 1973, economics professor Ralph Anspach published his own version of the game, much as Darrow had done. He called it ‘Anti-Monopoly’ and was promptly sued for trademark infringement by Parker Brothers, the company that published Darrow’s Monopoly. To defend himself, Anspach conducted extensive research. He found out about Magie’s patents and the similarity between her game and Darrow’s Monopoly. Anspach’s determination to tell the story of ‘the Monopoly lie’ finally gave Elizabeth Magie her fair share of the credit, though it was Charles Darrow who had reaped the financial gain and become a millionaire. This unfortunate outcome can be seen as a stark example of the very issue Magie wanted to address when she first invented the game. Over the years, Elizabeth Magie has been getting more credit for her invention. She lived life on her own terms and married Albert Wallace Phillips at the then-unconventional age of forty-two. It is left to us to look at the facts and decide for ourselves to whom the credit really belongs. Who do you think was the real inventor?
Next time you take out your Monopoly set to play, try the first set of rules designed by Magie, where players share instead of grabbing each other’s wealth. It might be more fun!
25 Paper
Paper is one of the most common stationery items. It is made of cellulose fibres from bamboo, grasses, wood, rags, etc., and is used for drawing, writing, printing, wrapping and many other things. Can you guess which country invented paper? Hint: They invented silk and gunpowder. And they have a ‘great’ wall, which was used for defence.
A tiny story of its origin
In the second century BC, the first form of paper is believed to have been invented by accident. According to popular lore, clothes made of hemp (a kind of cannabis plant, which can be used for making cloth and paper) were left to soak in water for too long. This left a residue from the clothing in the silt at the bottom, which could be collected to form a thin layer of a crude form of paper. However, the invention of paper is traditionally attributed to Cai Lun, a eunuch in the court of a Han ruler. By AD 105, hemp paper made from plant fibres, which were soaked and pressed, had become the common medium for writing instead of expensive silk or wooden strips. China was as secretive about the art of paper-making as it was about the technique of silk weaving, and it was several centuries before the knowledge spread to the rest of the world. In the thirteenth century, the knowledge of paper-making reached Europe via the Middle East, mainly Baghdad, and hence paper was called ‘Baghdatikos’.
The word ‘paper’ is derived from the Latin word papyrus.
The industrial revolution in Europe in the nineteenth century considerably reduced the cost of production, thereby lowering the price of paper and making it more accessible to the masses. The invention of paper contributed to the spread of knowledge and made books and literary works more accessible. It also influenced the rise of the three great Chinese arts of poetry, calligraphy and painting, which became the predominant art forms in China.
The printing press was invented by Johannes Gutenberg in the fifteenth century.
Around the eighth century AD, the introduction of block printing increased the demand for paper. The major consumers of paper were Buddhist scholars. Paper was a prized commodity, and during the Tang dynasty in China, people even paid taxes and tribute in paper. Paper money also made its appearance as trade expanded and the barter system was no longer a convenient means of commerce. Later, the invention of the printing press revolutionized the spread of literature and mass communication.
Side-effects
Although inventions are intended to improve the standard of life, it often happens that their consequences are not entirely beneficial. In the case of paper, the pulp from which modern paper is made is obtained by cutting down trees in forests. Not only are these trees a part of the ecosystem, they also provide shelter to wild animals and birds. The long-term practice of cutting down trees in this way is known as deforestation, and it has a very serious impact on our environment. To meet the ever-rising demand for paper, more and more areas of forest are cleared and their natural vegetation destroyed. Even the number of trees destroyed in wildfires or natural disasters is lower than the damage caused by large-scale deforestation.
Intervention
The most immediate concern in the twenty-first century is the issue of climate change. Massive deforestation, mining and other human activities have adverse effects on our natural resources. These effects can be seen both directly and indirectly. Lakes and natural sources of water are drying up and rapidly depleting, and trees are disappearing at the hands of big companies and their tractors. These are some of the direct effects. Indirectly, we often hear of wild animals invading human settlements. This is because their habitats and hunting grounds are gone, and they are desperately looking for means to survive.
Many of our rivers have lost their natural beauty, and the ones that have not dried up have turned into drains. A quick look at the rivers of India, the Yamuna and the Ganga for instance, will give you an idea of the extent of the devastation that we have carelessly inflicted on the things our lives depend on.
Therefore, with every new technology or invention that comes along, we must always ask the important question: what are its merits and demerits? Once we find the demerits, we must set out on the path to prevent or solve them. Recycling paper and cutting down on its use are some ways to fight deforestation.
Brainstorm more ideas to save paper with your friends. Put them down and start acting on them!
26 Pencil
Do you remember when you first learned to write your ABC? What did you use? For most of us, the answer would be a pencil. The word ‘pencil’ is derived from the Latin word penicillus, meaning ‘little tail’. The early writing instrument used by the Romans was called a stylus, and older pencils used a lead core. The modern pencil, whose core is composed of a form of carbon called graphite, was invented by a Frenchman, Nicolas-Jacques Conté (1755–1805), in 1795.
Where it all began
In 1564, the small town of Borrowdale, in England, was found to contain
lumps of pure graphite. Graphite leaves dark marks when rubbed on a surface. However, it was so soft that it needed to be wrapped in string to prevent it from breaking. The popularity of the graphite-core pencil rose rapidly, and the British Crown took control of the mines in which the graphite deposits were found.
Graphite is made up of carbon. Under very high pressure, it can turn into diamond, which is another form of carbon.
Invention under pressure
England had a monopoly (remember Monopoly?) over the graphite trade. Neighbouring countries, such as France, depended on graphite imported from England. However, the wars that followed the French Revolution cut off the supply from England, and France was forced to find an alternative to this essential material. The task fell upon Nicolas-Jacques Conté, who began looking for a way to compensate for the embargoed product. After several attempts, Conté decided to mix graphite with clay and water. Fired in a kiln at 1,900 degrees Fahrenheit, the mixture became soft and ductile, and he encased it in a wooden cylinder. Once it cooled, the solidified core left a clear black mark when used on a smooth surface. The case prevented the core from breaking off and made the pencil much easier to hold while writing. This was to become the first model of the pencil we use today. Conté patented his invention in 1795.
A stylus was a thin metal rod which left a mark on the papyrus used for writing. Papyrus, made from the papyrus plant, was one of the earliest writing materials.
Leaving a mark
Pencils evolved over time, and their softness or hardness can be adjusted through the ratio of graphite to clay in the mixture. The shapes of pencils also vary according to their use. The cylindrical shape is the most convenient; however, carpenters prefer polygonal or triangular pencils because cylindrical ones tend to roll off surfaces.
Curiouser and curiouser! Although a pencil contains graphite, the core is not a pure form of carbon, as it is mixed with clay and other substances. So it is unlikely to turn into diamond!
27 Penicillin
Penicillin is an antibiotic, a substance which stops the growth of harmful bacteria in the body of living organisms. Some infections caused by bacteria are pneumonia and certain eye infections. Antibiotics are different from antivirals and do not work on viral infections; it is important to note the difference between the two. Although older remedies were used to treat infections, modern medicine considers penicillin to be the first true antibiotic developed to treat bacterial infections.
To whom does the credit go?
The credit goes to the Nobel laureate Sir Alexander Fleming (1881–1955), a Scottish physician and microbiologist. Fleming was born on 6 August 1881 in Darvel, Scotland. He was an intelligent child and earned a scholarship to
Kilmarnock Academy, one of the finest schools in Scotland. At the age of twenty, Fleming followed in the footsteps of his elder brother and joined the medical profession. An unexpected inheritance provided him the means to attend St. Mary’s Hospital Medical School in Paddington, London. Twenty-five-year-old Fleming earned his medical degree with distinction in 1906. He stayed on in St. Mary’s research department as assistant bacteriologist to Sir Almroth Wright and became a part of the teaching faculty at St. Mary’s after obtaining his Bachelor of Science degree in bacteriology in 1908. He served at the hospital until the outbreak of the First World War, during which he served as a captain in the Royal Army Medical Corps (RAMC). After the war, Fleming returned to St. Mary’s, and in 1928, he was elected Professor of Bacteriology at the University of London.
Inspiration
During the war, as a part of the Medical Corps, Fleming witnessed the deaths of countless soldiers. The antiseptics available at the time were not effective enough; more soldiers died from infected wounds sustained in battle than in the battles themselves. Fleming resolved to find a better treatment for such wounds and pursued intensive research.
Fortune favours the absent-minded
A stroke of luck awaited Fleming on 3 September 1928. He had been on a long holiday and was eager to get back to work. Before leaving, he had been working on the bacteria staphylococci. On entering his laboratory, he noticed that he had forgotten to cover one petri dish of his staphylococci cultures. Inspecting the dish, he realized that the culture had been contaminated by a mould (a kind of fungus) that had drifted in through the open window, and the contaminated culture was behaving strangely! The mould seemed to have destroyed the colonies of staphylococci it came into contact with, while the ones it did not touch had survived.
To investigate further, Fleming started growing the mould himself. He found that it was able to kill several disease-causing bacteria. However, he was unsure about its potential, because the mould grew slowly and took a long time to show its effect.
Fleming initially called the substance ‘mould juice’, until he christened it penicillin in 1929.
The silent years
Fleming published his findings in 1929. However, it would be many years before his penicillin made any real impact. One reason was his personality: the scientific world paid little attention to what the quiet and unassertive bacteriologist had to say. It would take the intervention of the pathologist Howard Florey and the biochemist Ernst Boris Chain to advance Fleming’s discovery further. The duo purified and stabilized penicillin in 1940.
Antibiotics are bacteria busters!
Lifesaver
As the world entered the Second World War in 1939, the team of Florey and Chain was given the task of perfecting the much-needed antibiotic. After successful human trials, penicillin was put into mass production for use in the war effort. By the time the Allied forces landed in Normandy on D-Day in 1944, there was enough penicillin to treat the wounded, which greatly reduced the number of soldiers dying from infection.
Sir Alexander Fleming
In 1943, Fleming was made a Fellow of the Royal Society, whose members have included Sir Isaac Newton, Charles Darwin and, more recently, Stephen Hawking. He was knighted by George VI in 1944 and became Sir Alexander Fleming. In 1945, he was awarded the Nobel Prize in Medicine along with Florey and Chain. Elected rector of Edinburgh University, he served in office from 1951 to 1954. Penicillin may have become a wonder drug which revolutionized medicine and healthcare, especially in the first half of the twentieth century, but Fleming was modest enough to downplay his part. The possibility of stumbling upon an invention or discovery is always there; Fleming, however, had the knowledge, skill and insight required to recognize chance and turn it into a success. Fleming famously commented: ‘One sometimes finds what one is not looking for. When I woke up just after dawn on 28 September 1928, I certainly didn’t plan to revolutionize all medicine by discovering the world’s first antibiotic, or bacteria killer. But I guess that was exactly what I did.’
Some people are allergic to penicillin. Doctors can do a skin test to check for an allergic reaction before prescribing the antibiotic.
28 Plastic
Look around you. How many plastic objects can you see? Plastic is found everywhere in our surroundings: food wrappers, chairs, tables and much else are mostly made of it. Plastic is made of polymers, which can be moulded into any shape. When it was first invented, the idea behind plastic was to create an affordable and durable raw material for industry. How did that plan work out? We only have to look around us to see what we do with things that are made to benefit us.
An American dream
Synthetic plastic was invented by a Belgian named Leo Henricus Arthur Baekeland (1863–1944). He was born in Ghent on 14 November 1863, studied chemistry at the University of Ghent on a merit scholarship and obtained his PhD at the age of twenty-one. In 1889, he and his wife visited America on a travel scholarship. They settled there when he found employment in New York. He went on to establish the Nepera Chemical Company in
Nepera, New York, along with Leonard Jacobi and Albert Hahn. With the money from the sale of Nepera, he bought a house in Yonkers, set up a home laboratory and, in 1900, enrolled at the Technical Institute in Charlottenburg, Germany, for a refresher course in electrochemistry.
A polymer is a large molecule that is made of many repeated smaller molecules.
Plastic
By 1907, the polymeric properties of natural resins and fibres were being widely explored. Baekeland worked on the reactions between formaldehyde and phenol, and went on to study their properties under controlled temperature and pressure. This process formed a mouldable material, which he named Bakelite. He then founded the General Bakelite Company and manufactured Bakelite in his laboratory. In 1922, after winning a patent lawsuit, he merged the other companies he had founded into the Bakelite Corporation. Unlike the plastics that existed before, Bakelite was in great demand because it retained its shape even when heated. Baekeland’s invention marked the beginning of the ‘age of plastics’, an age that, ironically, has yet to end. Bakelite became a fundamental component of telephones, radios and electrical insulators. Thanks to its resistance to heat, its usage spread to various industries, and it gradually gained a foothold in the lives of consumers.
The chemical name of Bakelite is polyoxybenzylmethylenglycolanhydride! Can you say it? I bet you can’t!
Baekeland is also known as the ‘Father of the Plastics Industry’ for his contribution to modern plastic. He was awarded the Perkin Medal in 1916 and the Franklin Medal in 1940, and after his death in 1944, he was posthumously inducted into the National Inventors Hall of Fame in 1978. In recent decades, plastics have become a bane for society, and we must try to avoid using them.
Find out how plastic products are harmful to our environment.
Can you believe the famous Play-doh was once upon a time used for
cleaning wallpapers? Well, it’s true. Play-doh is made of a non-toxic, putty-like material, which can be moulded into any desired shape. It is a very popular toy among children from the toddler to pre-teen age group. It is also an effective educational tool, as it helps children explore their creativity. However, this product, which later became a children’s toy, was never intended to be one. It was designed to remove soot and clean wallpapers. Let’s find out more about this accidental toy. Kutol products Noah McVicker, the man who invented the dough, worked in his family’s company, Kutol Products, in Cincinnati, Ohio. It was a company that manufactured soap and household cleaning products. In the early 1930s, the supermarket chain Kroger Grocery requested the company to supply a product that could be used for cleaning soot off wallpapers. Noah
designed a putty-like substance named ‘Kutol Wall Cleaner’. It proved to be very effective. However, people started burning less coal in their homes as it emitted smoke and soot. The end of the Second World War also ushered in the use of natural gas for heating and cooking, as well as the invention of vinyl-based washable wallpapers. This led to a decline in the demand for the cleaning substance. Manufacturers, including Kutol Products, stopped making the cleaning putty. Around this time, Joseph McVicker, Noah’s nephew, joined Kutol Products in an attempt to salvage the company, which was facing bankruptcy. Cleaning to classroom Joseph’s sister-in-law, Kay Zufall, worked in a nursery school in Dover. After reading a newspaper article about how the cleaning putty could be used for art projects for children, she saw its potential. She tried it on her students and, to her delight, they loved it. However, since the product was no longer manufactured, it was hard to come by. Then, Zufall remembered that her brother-in-law worked at Kutol Products. Armed with contagious enthusiasm, Zufall managed to convince the uncle-nephew duo, Noah and Joe McVicker, to repackage and market ‘Kutol Wall Cleaner’ as a children’s creative toy. Of course, Zufall made sure the dough was non-toxic before she introduced it to her students; its main components were harmless. Sadly, after a long battle with Alzheimer’s disease, Kay Zufall passed away on 18 January 2014. Play-doh McVicker and his associate, Bill Rhodenbaugh, began remanufacturing the dough. However, the product’s goal and target had shifted dramatically. They were now manufacturing a toy for children. The name first intended for the product was ‘Rainbow Modeling Compound’, which did not seem like a tempting name for a toy. Kay Zufall stepped in once more, along with her husband Robert Zufall, and they came up with the name Play-doh, where ‘doh’ is a playful spelling of ‘dough’.
The next step was to bring Play-doh into the lives of American children, and they came up with a very good marketing strategy. Offering him a percentage of the sales, they convinced Bob Keeshan to use Play-doh in his children’s television show, ‘Captain Kangaroo’. This was a masterstroke: sales of the product soared, and it continues to maintain its popularity till date.
In 1998, Play-doh was inducted into the National Toy Hall of Fame. Forgotten hero Though the credit for the invention goes to Noah and Joe McVicker, along with Bill Rhodenbaugh, the real mastermind was Kay Zufall, whose foresight made it all possible. She and her husband did not enjoy any financial gain from the famous toy she helped introduce to the world. Instead, the couple set up a foundation named the Zufall Health Centre to provide affordable and much-needed care to the economically weaker sections of Dover’s residents.
The next time you want to buy a gift for someone young, you know what to get! And maybe, you can tell them the story of its invention too!
30 Pneumatic Tyres
Do you know how to ride a bicycle? Bicycle wheels used to be just round metal frames, without rubber tyres. Imagine how uncomfortable that must have been! We don’t have to worry about that today because of the invention of pneumatic tyres. These tyres have circular rubber tubes filled with pressurized air, which makes them comfortable as well as fast. Who invented it? A veterinarian named John Boyd Dunlop (1840–1921) did! Born in Scotland in 1840, he practised as a vet for nearly a decade before moving to Ireland in 1867. In 1871, he married Margaret Stevenson, with whom he had two children, a girl and a boy. The invention he came to be renowned for was the pneumatic tyre. The tyres that we use for transport might have suffered a setback of a few decades had it not been for this innovative father. In 1887, at their home in Belfast, Dunlop was watching his ten-year-old son, John, ride his tricycle. The young boy was a sickly child, and the doctor had prescribed the rides as a form of exercise to improve his health.
However, Dunlop noticed that the strain of trying to keep the tricycle running was too exerting for the young boy. Bicycles and tricycles, during those times, had thick solid rubber or metal tyres. They were uncomfortable and very slow, requiring a lot of effort from the rider to push on the pedals, especially on rough surfaces. As the rubber surface of the wheel touched their cobbled yard, the wheel would slow to a halt and the boy had to pedal incessantly to move at all! Dunlop wanted to help his son. Suddenly, an idea came to his mind, and he started an on-the-spot experiment. Taking sheets of rubber from his workshop, he glued them together to form a tube. He then coiled the tube around the metal rim that formed the wheel’s circumference and inflated the tyre with air using a football pump. Taking a normal metal wheel and his inflated-rubber wheel, he rolled them both across his yard. The metal wheel came to a halt shortly after, while the inflated wheel rolled on until it hit the gatepost of their compound! He replaced the metal wheels of the tricycle with his newly invented pneumatic wheels, and the first pneumatic tyres were born! The race to success Dunlop was confident that his tyres had much larger potential. After several tests, in 1888, he filed a patent for his invention. In 1889, on Dunlop’s suggestion, the cyclist William Hume used Dunlop’s tyres to win races in Ireland and England, winning all four cycling events at Queen’s College in Belfast. This was a unique feat and immediately caught the attention of Harvey Du Cros, the president of the Irish Cyclists’ Association, whose sons had lost the races to Hume. Du Cros approached Dunlop, and together, they set up a business selling Dunlop’s invention. Dunlop had already debuted his rudimentary pneumatic tyres on bicycles; they proved a success, and orders flooded in from eager cyclists.
Modern tyres are now customized to suit the terrain. Mud tyres with studs on the rubber prevent vehicles from skidding on snowy tracks and help them tread on soft mud. Pneumatic tyres have revolutionized the modern wheel and are used on cars, buses, aircraft, heavy-duty trucks, ATVs (all-terrain vehicles), racing cars, etc. Solid metal and rubber tyres continue to be used on trains and household appliances like lawnmowers. The Pneumatic Tyre and Booth’s Cycle Agency Dunlop was not aware that another Scotsman, Robert William Thomson, had already filed a patent for the same idea in 1846 in France, and in 1847 in the United States of America, a good many years before Dunlop himself stumbled upon his idea. Although Dunlop was the first to make a practical pneumatic tyre, because of Thomson’s patent, he was informed in 1890 that his own patent was invalid. Du Cros stood by Dunlop when the latter lost his patent, and together they formed a company named the Pneumatic Tyre and Booth’s Cycle Agency. Dunlop was able to commercialize his invention, and within a short span of time, in 1890, his first tyre plant was established in the city of Dublin. He retired from his veterinary practice in 1892, and by 1893, he was expanding his business to Europe, where he set up a factory in Hanau, Germany. Road revolution Dunlop’s invention came at a crucial time in history, when road transport was gaining importance as a convenient and faster form of local transport. It would be a few more decades before the first commercial automobile, invented by Karl Benz in 1885, became accessible to the larger public, who continued to rely on horse-drawn carriages and bicycles. In 1895, Dunlop sold his company and retired from the pneumatic tyre business, even as the company made headway into Australia and his tyres were sold in France and Canada. In 1900, Dunlop Pneumatic Tyres manufactured its first automobile tyres.
Du Cros sold the entire business to the British tycoon Ernest Terah Hooley for 3 million pounds in 1896. Dunlop did not reap much financial gain from his invention or his pneumatic tyre business, but he lived to see it change the way road transport was carried out.
Hooley sold the company for 5 million pounds under the name Dunlop Pneumatic Tyres. The name was later changed to Dunlop Rubber in the early twentieth century. By 1898, the business had grown too big for its Dublin base, and so the company headquarters was shifted to Coventry, England. In 1902, the base was shifted again, to what came to be known as Fort Dunlop in Birmingham, England. Du Cros remained the head of the business, oversaw the manufacture of the first car tyres in 1900, and established 50,000 acres of rubber plantation in Malaya in 1910. The company played a major part in the First World War, when it began to manufacture aircraft tyres, and Du Cros financed the first airship for the British military. The Dunlop legacy After retiring from the company, Dunlop lived a simple life running a drapery business until his death in 1921 at the age of eighty-one. He was buried at Deans Grange Cemetery in Ireland, the country that had been his home for almost his entire life. The company that he founded may no longer exist, but his contribution to modern technology has been immense, and the Dunlop name endures. It is still a familiar name in motor racing and tennis, because the company also produced tennis balls. He did not receive much financial profit from his invention, but his name came to be synonymous with pneumatic tyres. In 2005, eighty-four years after his death, Dunlop was posthumously inducted into the Automotive Hall of Fame. The quiet veterinarian left behind an invention that enhanced our mode of transport. His name is remembered as one of the innovators who changed the world, all because he was trying to give his son a smooth tricycle ride!
The very first pneumatic tyre Dunlop invented is displayed in the National Museum of Scotland!
Which is your favourite season? For most people, the answer would be summer, despite the heat. Indian summers are a medley of monsoon rains, blistering heat, delicious street food and long summer vacations. Of all these, few pleasures are as gratifying as licking and munching on a popsicle of your favourite flavour as you head home from school. When we think of inventors, the common image that comes to mind is that of old men in lab coats with unkempt hair (Einstein, remember?) who go about in a trance-like state, taking notes and doing experiments all day. In this chapter, let us discover who invented this slurpilicious treat to beat the summer heat. Frozen
Once upon a time, there was a boy named Frank Epperson who lived in California in the United States of America. One cold winter evening, Frank sat on the porch of his house, drinking a glass of soda water. The chilly winter breeze made him shiver, and he went inside to seek shelter in the warmth of the fireplace. In his hurry, Frank left his glass of soda water outside, and in the snug comfort of the house, he completely forgot about it. That night was a particularly cold one, with temperatures dropping much lower than they normally did, but little Frank slept through it all! The next morning, Frank opened the door to the front porch and saw the glass of soda water he had left out the previous night. He quickly rushed to retrieve the glass and drink up the leftovers before anyone could scold him for being careless. To his surprise, the water was gone! In its place was a chunk of ice, with the spoon he had used for stirring stuck in it! ‘Hmmm… This is strange,’ said Frank as he explored the portable iceberg. He rattled it, shook it and held it upside down, but the chunk of ice didn’t budge. Frank took the glass inside and remembered how his science teacher had said that ice melts when it comes in contact with heat. Thank his lucky stars for paying attention in class! He lit the cooking stove and, using the spoon as a handle, gently held the glass over the flame at a safe distance. Slowly, the ice began to melt, and he wrenched the spoon free. There was a sound that went ‘pop’, and the frozen soda water slipped out of the glass with the spoon still attached to it. Frank was delighted; he immediately started licking the frozen soda and took a bite. It was delicious! He was holding and eating soda water instead of drinking it. Lo and behold! The first ever popsicle was born! Many years later The story of Frank Epperson’s invention took a long pause. Little Frankie was too young to understand the importance of his invention.
To an eleven-year-old boy, life was full of adventures and distractions, and one afternoon of eating soda water was soon forgotten. So, how then did we come to enjoy popsicles if Frank forgot all about it?
A realtor is an agent who sells land, buildings, houses, etc. Fast forward seventeen years to 1922. Frank was now a twenty-eight-year-old man with a job as a realtor for a company in Oakland, California. According to popular accounts, Frank was invited to attend a small public event. He wanted to make a good impression and present something unique. Suddenly, his thoughts went back to that long-forgotten cold winter morning, and he remembered the taste of the frozen soda water. He decided to take a chance and prepare a flavoured drink. He poured it into glasses with sticks for handles and put the concoction into the freezer. He called it ‘Epsicle’, a portmanteau of Epperson and icicle. Frank’s Epsicle, or ‘frozen drink on a stick’, was a runaway hit at the event. Its success inspired him to organize sampling trials. Two years later, in 1924, he filed a patent for his invention. After receiving the patent, he began selling his ‘drink on a stick’ in different flavours and peddled them at the Neptune Beach amusement park in California. His children came up with their own name for his product, which they called ‘Popsicle’. It was a moniker, or nickname, for Pop’s icicle.
A portmanteau is a combination of two words to form a new word. Frank liked the name; he changed the name of his product to ‘Popsicle’ and trademarked it so that others would not be able to use the name. To this day, Popsicle remains the name of the brand, although similar products are now sold under several other names, like ice-pop and ice-lolly. They come in different flavours and are as popular as when they first made their debut in the market, if not more so. On the road to more success Unfortunately for Frank Epperson, although he had stumbled upon a great invention, he did not know much about marketing, and so he had to sell Popsicle to someone who did. The Joe Lowe Company bought Popsicle in 1925. The company began marketing Popsicle on a larger scale, and it became a sensational hit! By 1939, the company was promoting its product by introducing its mascot, ‘Popsicle Pete’, as the winner of the ‘Typical American Boy Contest’ on a popular radio programme. Interestingly, Popsicle was also used by the Eighth Air Force Unit as a symbol of the American way of life that the soldiers were fighting to defend during the Second World War. Popsicles have come a long way, and more than a century later, they still represent the simpler pleasures of life.
Now we know: sometimes all it takes is a curious mind and a stroke of luck! Let us never lose that sense of curiosity about what could happen.
32 Post-it Notes
Post-it notes are so convenient. Despite being a part of our essential stationery, they are easily taken for granted. We use them to mark certain pages in books, or to make notes and stick them as reminders on the fridge. We hardly think of how this little sticky memo came into being or who should be credited with inventing it. However, every invention, from the biggest to something as small and seemingly inconsequential as the Post-it note, has the contribution of people behind it. Sometimes, the stories are really interesting too. Let’s find out whom we owe our gratitude to, and why Post-its were originally yellow in colour. Theory one: Silver and Fry The first theory of the invention of Post-it notes goes like this. In 1968, a scientist named Dr Spencer Silver was working at the Minnesota Mining and Manufacturing Company, popularly known as 3M. They were trying to develop an adhesive much stronger than the ones already available. However, as luck would have it, Silver ended up with an adhesive that could be reused!
Unlike the adhesive he had intended to make, this one stuck lightly to a surface without bonding with it and could be detached easily. This was the first Post-it, according to 3M. For years, Silver tried to promote his invention within the company, even presenting his concept in seminars. However, nobody showed interest, and his invention was called a ‘solution without a problem’, suggesting that he had invented something that was not needed in the first place. But Silver didn’t give up! His luck began to turn when he encountered Arthur Fry, another scientist working for the same company. Around the time Silver had been trying to promote his invention, Fry had been struggling with a simple yet persistent problem. He was an active member of his church choir. Before each upcoming service, he would use scraps of paper to mark the songs he had selected. To his dismay, by the time the service started, the scraps of paper would have fallen off or got scattered! On one such occasion, Fry recalled a seminar he had attended a while back, where a man had demonstrated the use of adhesive paper. It struck him that the sticky paper would make a good bookmark, since it would not fall out from between the leaves of his hymn book. Fry teamed up with Silver, and the two focused on rebranding the product and handing out samples among the company employees. With positive feedback, 3M took notice, and the product was launched under the name ‘Press and Peel’ bookmarks. It was not the instant success they might have hoped for, but the company was determined to make the product work. Boise blitz 3M decided to conduct Operation Boise Blitz, in which they placed the product directly into consumers’ hands in the town of Boise in Idaho. They handed out free samples for consumers to try, and it proved to be a tremendous success. Colour me yellow The bright yellow colour of the original Post-it notes was as coincidental as the invention itself.
When Silver and Fry were conducting their experiments, the laboratory next to theirs only had yellow scrap paper to spare. And so, the invention came in a canary-yellow colour. Since its nationwide introduction on 6 April 1980, the invention has been produced in over 150 countries till date. The two inventors retired from 3M with the highest honours, and Silver was inducted into the National Inventors Hall of Fame in 2010.
It was always a self-advertising product because customers would put the notes on documents they sent to others, arousing the recipient’s curiosity. They would look at it, peel it off and play with it, and then go out and buy a pad for themselves. —Arthur Fry
Sticky lawsuit The story would have remained a beautiful one of an accidental invention that turned into a success, had it not been for the lawsuit filed by inventor Alan Amron in 1997. Amron filed a suit against 3M, claiming that he had invented the Post-it before 3M launched the product, and that it was he who had disclosed the concept to 3M in 1976. The two parties reached an agreement, and the matter was settled in 1998. In 2016, Amron filed another suit against the company for claiming that they had invented the Post-it; however, the court dismissed the suit, upholding the 1998 settlement. Meanwhile, a 3M employee named Daniel Dassow voluntarily testified in 2016 that Amron’s claims were true and that Amron had disclosed the concept of his sticky memo to 3M. Theory two: Amron’s story We may never be able to ascertain whether it was Amron or Silver and Fry who came up with the invention, but let us try and learn both sides of the story.
According to Amron, he accidentally invented the Post-it note in 1973 while trying to leave a message for his wife. She had gone out, and he wanted to tell her that he was going to a meeting. Amron took a slip of memo paper, which had no adhesive on either side, and jotted down the message. He looked around for Scotch tape to stick the memo on the refrigerator so that his wife wouldn’t miss the note, but couldn’t find any. It was then that he saw a pack of gum lying on the counter. The sticky gum suddenly gave him a thought: since gum is sticky, maybe he could use it to fix the memo to the fridge! He popped a piece into his mouth and chewed on it till it became sticky. Then he stuck it onto the refrigerator and pressed the memo onto the gum. It held fast! Amron’s wife was very impressed. They discussed the possibility of developing it into a commercial product, and so Amron started experimenting with various adhesives and came up with the Post-it notes that adorn our study tables, notice boards and refrigerators. Post the stories on Post-its The advancement of technology has brought revolutionary changes in the way we communicate and pass on messages. Nowadays, we can send text messages, emails, direct messages on social media platforms, etc., and Post-its are no longer a popular option for sending messages. However, for brainstorming sessions in offices, marking important pages of books, taking notes, etc., notepads are still indispensable. Post-it notes have also evolved with technology, and computers have virtual Post-it notes, or desktop notes. It seems Post-its aren’t going anywhere; they pop up in films, in pop art and in the daily lives of millions of people every day.
Evernote teamed up with Post-it to create an app for taking notes on a smartphone.
Look around you and notice how Post-its are being used!
33 Potato Chips
How much do we know about one of the most popular snacks in the world
—potato chips? Potato chips form a part of what is popularly known as comfort food. They have become a necessary indulgence in this era of time constraints and work pressure. Whether one is watching television or entertaining guests, snacks are an essential part of these social interactions. These foods minimize the amount of time spent in preparation and provide a ready-to-eat source of contentment. This popular snack has also found its way into the hearts of Indian households. It even has a very interesting story behind it. Whip up the chips George Crum (1824–1914), also known as George Speck, was born at Saratoga Lake, New York. He was a chef of African-American and Native American origin. He is said to have accidentally stumbled upon this innovative way of serving fried potatoes while working at Moon’s Lake House.
French-fried potatoes were said to have been introduced to the American public and popularized by Thomas Jefferson, one of the United States of America’s Founding Fathers. According to local legend, in the summer of 1853, on a particularly busy day at Moon’s, a customer ordered French-fried potatoes, a popular delicacy in the United States of America. The peevish customer kept sending the food back with the complaint that the potatoes were too thick and soft. Frustrated and at his wits’ end, Crum finally took a fresh batch of large potatoes, peeled and cut them into very thin slices and fried them till they turned golden brown. He then sprinkled salt on them. These fried potatoes were too thin and crispy to be skewered with a fork, and Crum hoped this would teach the customer a lesson. However, Crum was surprised to find that the customer enjoyed the fried potatoes. The story spread, and thus began the popularity of ‘Saratoga Chips’ or ‘Potato Crunches’, which became a house delicacy at Moon’s. Tourists were advised to include the Lake House in their itinerary and sample a taste of the famous chips, which had become a local sensation. Potato flakes pioneers Recipes resembling potato chips can be found as early as 1822, in William Kitchiner’s The Cook’s Oracle, as well as in Mary Randolph’s Virginia House-Wife (1824) and N.K.M. Lee’s Cook’s Own Book (1832). However, the credit for inventing the eponymous snack most widely goes to George Crum of Saratoga Springs. The success of Saratoga
Chips launched Crum to greater heights, and by 1860, he was able to open his own restaurant, called ‘Crum’s’, which had a basket of his special chips on every table. His restaurant closed down in 1890, and he passed away in 1914, at the beginning of the First World War. The chain reaction Crum’s special ‘Saratoga Chips’ held great market potential. Since he had not patented his invention, many people adopted his technique and popularized it among American consumers. However, due to the inconvenience of packaging and storage, the chips were mainly served at restaurants even many years after their invention. In 1895, William Tappendon began selling the chips in his grocery store and supplied them to his neighbourhood. He later converted his barn into a kitchen dedicated specifically to the production of potato chips. This was the first potato-chip factory in the United States. The potatoes used for making chips were peeled by hand, which made the task tedious and time-consuming, thereby slowing the rate of production. However, almost seventy years after the invention of the chips, the mechanical potato peeler was invented in 1920. This was followed by the wax paper bag, which preserved the chips and prevented them from crumbling. These inventions made the task much easier and quicker, and enabled mass production of the food, which has now become a mainstay snack on the American menu.
In India, the potato-chips industry is dominated by Lay’s. Other popular brands include Uncle Chipps, Pringles, Bingo and Haldiram’s. They come in classic salted, spicy tomato, cream and onion, and masala flavours. Lay’s has different flavours, ranging from Spanish Tomato Tango and Classic Salted to Magic Masala and American Style Cream & Onion, to cater to Indian tastes.
At the beginning of the twentieth century, many companies established potato-chips businesses, including the Leominster Potato Chip Company, founded in 1908 and later renamed Tri-Sum Potato Chips after a naming contest held by the company. However, none is as well known as Herman Lay, who started his potato-chip business in Nashville, Tennessee, in 1932. At the beginning of his career, Herman Lay, a travelling salesman, would peddle potato chips from the trunk of his car and supply them to grocery stores in the southern USA. Over the course of time, his empire grew, and his name became synonymous with the crispy, salty snack. Lay’s is now one of the most popular brands of potato chips in the world. A chip by any other name The snack’s name varies according to regional usage, and different variants of it have appeared over time, being modified and experimented with. Pringles, made from dehydrated potatoes, are one such example. The technique has also been extended to other foods, such as bananas, across the world. Although the most common name is chips, in countries like Ireland and the United Kingdom, the snack is referred to as ‘crisps’, and ‘chips’ there means something similar to French fries. Joe ‘Spud’ Murphy, founder of Tayto, came up with the cheese and onion flavour in 1954. Nowadays, potato chips are modified into regional flavours to suit the varied culinary tastes of consumers. In Japan, popular flavours include garlic, soy sauce and plum, while Indonesians prefer nori seaweed, sour cream, etc. Crum could not have predicted the extent to which his Saratoga Chips would shape the idea of a snack for global consumers. Another interesting aspect of the invention is that it necessitated the invention of other things associated with it: the mechanical potato peeler, the wax paper bag, and various other devices and methods were invented to improve the product’s quality. Today, potato chips remain a favourite among consumers the world over.
Why do chip packets have so much air? The gas you find in the packets is nitrogen. It helps keep the potato chips crisp and fresh!
Radioactivity is the phenomenon of radioactive decay or nuclear radiation.
Any element that contains an unstable nucleus is ‘radioactive’. Examples of radioactive elements are uranium, plutonium, thorium, etc. In 1896, a French scientist named Henri Becquerel (1852–1908) discovered radioactivity. He had been working on uranium salts in order to understand their phosphorescent quality (the light emitted after a substance is exposed to radiation). This interest was inspired by the discovery of X-rays by Röntgen in 1895. Becquerel was determined to find out whether phosphorescent materials emitted the kind of rays Röntgen had described. The hypothesis Becquerel’s hypothesis was that phosphorescence occurs due to exposure to light, somewhat like an object glowing after soaking up sunlight. He took some uranium-rich salt crystals and placed them on a photographic plate. This plate had been wrapped in order to prevent exposure to light. He was hoping that the uranium crystals, when exposed to light, would emit rays that would show up on the photographic plate. However, as luck would have it, on the day that he was conducting the experiment, the weather was unfavourable, and he took the whole setup and placed it in a dark drawer, hoping to complete his experiment another day.
An atom is the smallest particle of an element. Electrons surround its central part, the nucleus, which contains neutrons and protons. There has to be a balance between the two: too many neutrons or protons upset this balance, and the nucleus becomes unstable. An unstable nucleus tries to regain balance by giving off particles and energy. This is radioactive decay.
A few days later, Becquerel returned to his laboratory to resume his experiment. He opened the drawer to take out the photographic plate and the uranium salts. What he saw before him marked the birth of nuclear physics.
The photographic plate showed distinct images of the uranium crystals even though there was no possibility of the crystals being exposed to light inside the dark drawer. Becquerel concluded that some kind of energy was spontaneously emitted by the uranium crystals, which left its mark on the photographic plate.
The SI (International System) unit of radioactivity, the becquerel (Bq), is named after Henri Becquerel.

The Curies

After Becquerel's accidental discovery of radioactivity, Marie Curie, who was his doctoral student, and her husband Pierre expanded on his finding and conducted more experiments to find out if other elements also exhibited the same property. They successfully showed that other elements with similarly unstable nuclei emitted the same radioactive rays. For their pioneering research, the Curies and Becquerel shared the Nobel Prize in Physics in 1903.

Radiotherapy

Five years after the discovery of radioactivity, Becquerel made another accidental discovery. In 1901, he had been working with the radioactive element radium and accidentally left a piece of it in his vest pocket. The radium burnt his skin, and this led him to the conclusion that radioactive elements could, in due course of time, be used in medicine. Becquerel thus vaguely foresaw the role radiotherapy would play in modern medical science.

Becquerel, the man

Born into an affluent family of scientists in 1852, Antoine Henri Becquerel
trained as an engineer, qualifying in 1877. However, his primary interest was physics, and he attained a doctorate in 1888. Although Becquerel had discovered radioactivity, the extent to which radioactive elements posed a threat to the human body was then unknown. He could not have known that he had been constantly exposing himself to some of the most harmful radiation known to man. The daily exposure took a toll on his health, and by the time of his death, a mere twelve years after his discovery, there were severe burns on his skin, believed to have been caused by his exposure to radioactive elements. Henri Becquerel, the father of nuclear physics, died on 25 August 1908, at the age of fifty-five.
Remember the Apollo 11 mission, when Neil Armstrong and his fellow astronauts hoisted the American flag on the moon? Well, according to scientists, the flag has turned white! Why? Because of unfiltered ultraviolet radiation on the nylon used for making the flag.
35 Safety Glasses
Safety is an important aspect of any modern appliance or machinery. We take precautions to ensure safety and eliminate the possibility of accidents. The automobile, for instance, follows strict safety standards, with safety glass installed in the windows and airbags to cushion impact. A manufacturer goes to great lengths to ensure the safety of the driver and passengers, and every automobile has to go through several tests before it reaches the showroom. However, this had not always been the case. The automobile was only invented in the late nineteenth century, and safety precautions for its users were not considered a priority. Fortunately, the mistake of one man led to an accidental invention that was to make the world a safer place.

Lazy luck

In 1903, the twenty-five-year-old French chemist and painter Edouard Benedictus was working in his studio in Paris. As he climbed a ladder to fetch chemicals placed high up on a shelf, he accidentally knocked over an
empty glass flask. The flask fell to the floor with a shattering sound. To his astonishment, Benedictus saw that the glass had not shattered into hundreds of broken shards but had clung together; the original shape of the flask was still preserved. This made him curious. He questioned his assistant about the flask and learned that it had contained a solution of cellulose nitrate (a liquid plastic). The solution had evaporated, leaving behind a thin coat of plastic on the interior of the flask. Assuming the flask to be clean, the assistant had been a little negligent in his duty: he had not washed the flask before placing it on the shelf. The plastic coating on the inner wall of the flask had held the pieces of glass together!

Triplex

Benedictus continued to experiment with plastic-coated glass for the next twenty-four hours, and by the following afternoon, voila! He had in his hands the first ever sample of 'Triplex', safety glass! He continued working on his invention, using a sheet of cellulose to bond and laminate two plates of glass together. On testing, he found that at the moment of impact, instead of shattering, the cellulose held the broken shards of glass together in a web-like pattern. However, it would take a few more years for his invention to be put to practical use, or for people to realize the value of his contribution.
Benedictus was granted a patent for his invention in 1909.

Making the world a safer place (accidentally!)

Cars produced at the beginning of the twentieth century had glass windows that would shatter upon impact and cut the occupants, often causing fatal injuries. Benedictus saw the potential of his Triplex and tried to sell his invention to automakers. Unfortunately, manufacturers were not interested. Automobiles were an expensive luxury, and automakers were already trying to cut production costs in order to increase sales. Installing expensive safety glass would mean a sharp rise in prices, and hence a drop in sales.

An unlikely market opened up for Benedictus during the First World War (1914–18). His laminated glass was used to make goggles for gas masks, a much-needed protection against the chlorine-based poison gas used by the Germans. The goggles were small, oval pieces of laminated glass, inexpensive and easy to manufacture.

One factor that led automobile manufacturers to adopt safety glass was the rising number of complaints against them in cases of accidents. As more people began to own and drive vehicles, there were more accidents, and shattered window glass caused more injuries than any other part of the automobile. Manufacturers began to see the wisdom of investing in safety glass. In 1919, Henry Ford (1863–1947), founder of the Ford Motor Company, started using laminated glass in the automobiles he manufactured, and over the following decade he made it standard practice for all subsequent Ford productions.
In 1912, Reginald Delpech obtained the licence for the Triplex Safety Glass Company Ltd. in England under the French patents, and the product made its way into the United States. A disadvantage of laminated glass was that the inner layer of celluloid tended to discolour and grow brittle, and could be easily punctured. The Canadian chemists Frederick W. Skirrow and Howard W. Matheson invented PVB (polyvinyl butyral) in 1927, which blocked ultraviolet rays and made the glass clearer and more durable. Ford was a pioneer in advertising the use of safety glass, and other manufacturers began installing laminated glass supplied by the Pittsburgh Safety Glass Company, the leading manufacturer in the US, which produced an economical version of the glass called Duplate. Herculite, a tempered glass much tougher than laminated plate glass, was later developed, and by the late 1950s, safety glass was installed in all cars except in the rear windows.

The invention of laminated glass may have been an accident; however, the extent to which mankind has explored its potential has made it a commercial success. Its use is no longer confined to windscreens alone. Architects began to incorporate safety glass in doors, panels, mirrors and windows for several reasons, ranging from security in shops and bank windows to sound reduction, durability and protection against hurricanes, hail and other natural disasters. More than a century after its invention, the safety glass of Edouard Benedictus has gone on to change the architectural style and aesthetics of modern buildings.
Toughened glass is a type of safety glass used for walls and windows, as it is hard to break. In case of breakage, it shatters into small, blunt pieces that are not injurious.
Foods such as bread, cheese and meat have long been eaten together, or separately as side dishes. One of the most popular ways of having them together is the sandwich.

Fast food

The invention of the sandwich is attributed to John Montagu, the 4th Earl of Sandwich, after whom the snack is named. John Montagu (1718–92) was a prominent member of the English aristocracy and an avid gambler. Around 1762, he was on a winning streak and so engrossed in his game that he did not want to get up to eat. He asked the cook to prepare something that would not interfere with his gambling. He was surprised to find that the cook had prepared something that not only tasted good but was easy to consume: cooked meat placed between two slices of bread. It was not messy to eat, no fork or spoon was needed, and he could eat while playing! Sure enough, many members of the court soon began to ask for the same
kind of food. Lo and behold! The sandwich was invented!

In the 1760s, literary references describe the sandwich mainly as a food consumed by men at drinking parties, until it became a general item on the eighteenth-century menu of the non-aristocratic classes as well. In the nineteenth century, there was a growing demand for food that was convenient to carry on picnics yet delicious enough not to dampen the celebratory spirit. The sandwich was ideal, and it began to wind its way into the regular diet of the British gentry and peasantry alike. On 24 November 1762, the first written record of the word 'sandwich' appeared in the journal of Edward Gibbon.

The sandwich was introduced to the American public by Elizabeth Leslie (1787–1858), an American author, in her cookbook titled Directions for Cookery in Its Various Branches. She describes the recipe for a ham sandwich as:

Ham Sandwiches—Cut some thin slices of bread very neatly, having slightly buttered them; and if you choose, spread on a very little mustard. Have ready some very thin slices of cold boiled ham and lay one between two slices of bread. You may either roll them up or lay them flat on the plates. They are used at supper or at luncheon. Elizabeth Leslie*
There are many varieties of sandwich available, among them the famous peanut butter and jelly sandwich, the hamburger, the submarine, etc. Before being known as the sandwich, it was known simply as 'bread and meat'. A court in Boston ruled that a sandwich must have at least two slices of bread, thereby excluding foods like burritos, tacos and quesadillas from being considered sandwiches. Sometimes described as England's greatest contribution to gastronomy, the sandwich continues to be the go-to food for a quick yet satisfying meal.
In 1980, Marks and Spencer sold the first pre-made packaged sandwich. It was a huge success.
DIY: Follow Elizabeth Leslie’s recipe and make your own sandwich! You can replace the ham with paneer!
*Elizabeth Leslie, Directions for Cookery in Its Various Branches, https://archive.org/details/misslesliescompl00lesl/page/206
37 Sanitary Pad
Menstruation is a biological process in which blood and tissue are discharged from the uterus. Its duration varies from two to seven days, and it is informally called 'periods'. Although it is a natural process, it remains one of the least discussed and most tabooed subjects.

The period myth

In developing countries like India, there are many misconceptions regarding menstruation, which are often detrimental to the well-being and dignity of women. A woman is considered unclean during menstruation and is not even allowed to access certain places. There is an interesting story about the Greek mathematician Hypatia, who was said to have thrown her menstrual rag at a relentless suitor to discourage him. In earlier ages, rags, sand, grass and other absorbent
materials were used by women. One can only imagine the plight and discomfort women had to endure. Disposable sanitary pads are a fairly recent phenomenon.

Conception

The concept behind the disposable sanitary pads used today had an unlikely beginning. Benjamin Franklin, one of the Founding Fathers of the United States of America, is believed to have come up with the idea of blood absorbents in the eighteenth century, while trying to devise ways to stop the bleeding of wounded soldiers and prevent death due to blood loss.
Cellu-cotton is a fibrous material made from wood pulp, used as a substitute for cotton because of its absorbent property.

During the First World War, the Kimberly-Clark Corporation in Wisconsin produced cellu-cotton, processed from wood pulp, as a cotton substitute for use in soldiers' bandages. Nurses found these bandages to be effective as menstrual pads and also inexpensive enough to be disposed of after use. This convenience was a welcome change for the exhausted nurses working in a hazardous environment. After the end of the war, Kimberly-Clark continued manufacturing cellu-cotton, packaged it as sanitary pads and launched Kotex.

Nonsense and sensibility

In the beginning, sales were slow, as customers were embarrassed to be seen buying the product. The manufacturers found a clever way of retailing it by placing boxes of napkins in a strategic position alongside a box for payment. This self-service mode eliminated the discomfort of dealing with a sales clerk and encouraged more women to try the new method of
tackling an age-old problem. The shape and design of the pads also evolved as demand increased. The earlier designs were simple rectangular layers of cellu-cotton covered with an absorbent fabric, tied at both ends and fastened to a belt worn at the waist. These were not convenient, as they restricted movement. A later design introduced pads with a strip of adhesive attached to the bottom layer, which could be stuck to the underwear. This is still the design used today, along with 'wings', or lapels, on either side to hold the napkin in place and avoid leakage. Pads now come in varying sizes, and there is a plethora of brands to choose from.

Pink tax

'Pink tax' refers to the claim that retail industries discriminate against women by pricing products targeted at them higher. Sanitary pads are a necessity for women. However, these products have been sold at prices too expensive for women belonging to lower income groups. Hence, women in less developed countries still resort to unhygienic rags and other hazardous ancient methods. This is where Arunachalam Muruganantham of Coimbatore, Tamil Nadu, stepped in. The reticent Indian, given the moniker 'Pad Man', invented low-cost machines that make cost-effective sanitary pads.

Pad Man

The story behind Muruganantham's invention is a deeply personal one. He was concerned about his wife, who used rags and old newspapers because the sanitary pads sold in drugstores were too expensive. His obsession with sanitary pads made him an outcast in his village, as menstruation was still a taboo in most rural villages of India. Nevertheless, determined to make low-cost sanitary pads, he found a way to build a machine that could make an effective pad at a low production cost. This made it possible for the pads to be sold at a much cheaper price, making them affordable even for the financially weaker sections of society.
Not only did he improve the quality of women's hygiene, he also empowered women through employment. His contribution is seen as a crucial step in changing the lives of women in India.
End of sentence

The inspiring story of Muruganantham has been told through films and documentaries: Menstrual Man by Amit Virmani in 2013, Phullu in 2017 and Pad Man, starring Akshay Kumar, in 2018. In 2019, the documentary film Period. End of Sentence. won the Best Documentary (Short Subject) award at the ninety-first Academy Awards. In the twenty-first century, women will lead more wholesome lives once something as natural as periods stops being a punishment for them!
India is still trapped in the many superstitions related to menstruation and the female body in general. It is our task to educate the public and break the myths surrounding this very natural biological process.
Silk is a fibre that we get from the cocoon of the mulberry silkworm. The larva produces the fibre from its mouth and winds it around its body to form a cocoon. It is a precious fibre, used to make clothing with a shimmery quality. Silk is expensive because of the lengthy process of rearing the silkworm, which feeds only on the mulberry plant.

Lady of the silkworm

The legend of how silk was accidentally discovered begins in China around 2640 BC, during the reign of the Yellow Emperor, Huangdi. The story goes like this. One day, the Empress, Si-Ling Chi, was drinking tea in the garden
under the shade of a mulberry tree. Suddenly, a cocoon containing a silkworm larva, attached to the mulberry leaves, fell into her cup of tea. She tried to take the cocoon out, but it had been softened by the hot tea, and it came apart in her hand, leaving behind a thread-like strand. She began pulling the thread, and it kept unravelling. The empress was astonished to find herself holding a long, continuous string of silk yarn. The emperor immediately ordered that the yarn be spun and woven. The cloth produced was beautiful, and it seemed as if the fabric absorbed and reflected different hues of light!

China dominated the silk trade. Weaving silk was strictly confined to the empire, and anyone smuggling the eggs of the silkworm or the seeds of the mulberry plant was subjected to the harshest punishment. China maintained its monopoly on the silk trade until the sixth century AD. The famous Silk Route that opened between China and Rome was so named because of the valuable silk fabric being traded along it. It is said that around 550 AD, under the Byzantine Emperor Justinian, two missionaries who had gone to China successfully smuggled out silkworm eggs and mulberry seeds inside their hollow staffs. This gave the Byzantine Empire a monopoly on the silk trade in Europe. Once the secret had reached Europe, it was only a matter of time before several countries learned the trade. By the fourteenth century, the major cities of Europe had flourishing silk industries, although the long process involved in rearing the silkworm and finishing the product still made silk an essentially expensive commodity.
Sericulture: the practice of farming silkworms on a large scale in order to harvest their cocoons and extract raw silk.

As soft as silk

How many cocoons do you think are needed to make one blouse? More than 600! No wonder silk is so expensive. Although there are several accounts of how silk was first discovered, the story of the Empress Si-Ling Chi and how she accidentally discovered one of the most expensive and luxurious fabrics in the world remains the most popular and widely accepted account to this day.
An average of twenty-five to twenty-eight days is required for a silkworm to mature and spin a cocoon. The cocoons are then plucked from the mulberry leaves and placed on straw, to which they attach themselves. After the cocoons are completely woven around the worms, the process of extracting the silk begins, a process that involves several stages.
The Silk Road is a name given to a network of routes connecting East Asia and Southeast Asia with South Asia, Persia, the Arabian Peninsula, East Africa and Southern Europe. It was the longest trade route by land in the ancient world.
39 Stainless Steel
If you take a look around your kitchen, you will find many items made of stainless steel: spoons, pots, knives, etc. They are such an integral part of our cutlery and cookware that we hardly think twice about how they came into being. Stainless steel is the preferred material for cutlery because it is corrosion-resistant (not easily damaged), lustrous and lightweight.

What's in a steel?

Stainless steel is an alloy, or mixture, of iron, chromium and carbon. When the element molybdenum (Mo) is added to the alloy, it further increases the corrosion resistance of the steel and makes it more practical than iron or bronze. Stainless steel is also an essential component in the construction of buildings because it does not corrode easily when exposed to the natural elements of heat and rain.

As is the case with most inventions where there is no clear evidence of the
inventor, many claimants arise. Stainless steel, too, has been the subject of claims and counter-claims. We shall learn about the man who is widely considered to be the inventor.
Brearley's invention of stainless steel was announced by the New York Times in 1915.

Man of steel

The credit for inventing stainless steel is most commonly given to Harry Brearley (1871–1948). Born in Sheffield, England, Brearley worked at the Brown Firth Laboratories. In 1912, he was entrusted with the task of finding a solution for a gun manufacturer whose gun barrels kept eroding. Brearley immediately started experimenting to find an erosion-resistant steel alloy. He ended up with a corrosion-resistant one! On 13 August 1913, he created a steel alloy with 0.24 per cent carbon and 12.8 per cent chromium. This is said to be the first stainless steel ever made. Whether he arrived at the proportions through calculation or by a stroke of pure luck cannot be determined.

Popular lore describes how Brearley tossed the alloy into the garbage bin in disappointment. Later, he noticed that it did not rust as quickly as the other metals in the bin. He went on to test the steel with several acidic substances and found that it still had not corroded. Brearley felt that his invention could have great potential in cutlery manufacturing, and he wanted to produce cutlery made of stainless steel. The knives had to be hardened so that they would be strong and durable enough for daily use, but Brearley was disappointed by the lack of support from his employers. However, R.F. Mosley, the cutlery manager at Mosley's Portland Works, came to his rescue. Brearley worked with this local cutler, who helped him produce hardened knives that were ideal
for cutting and chopping. Nowadays, stainless steel is widely used for most of our cutlery because of its features like rust-resistance and affordability.
Steel challenge: Make a list of steel items in your kitchen!
Cyanoacrylates are commonly known by their commercial name, 'superglue'. They are made up of esters (a type of chemical compound) of cyanoacrylic acid. Today, superglue has become a necessity in arts and crafts, medicine and industrial work, and in daily use for mending broken items. The credit goes to Dr Harry Wesley Coover Jr. (1917–2011). Born in Newark on 6 March 1917, he earned his PhD from Cornell University in 1944. In 1942, Coover and a team of scientists, researching materials for gun sights to be used in the Second World War, discovered a sticky substance that stuck fast to any material it came in contact with. A patent for cyanoacrylate was filed by the Goodrich Company that same year. Since the substance was not considered beneficial for the war effort, its potential was not explored
further.

Eastman 910

In 1951, Dr Coover rediscovered the commercial potential of cyanoacrylates. He was working for Eastman Kodak, an American technology company, where he oversaw a team researching heat-resistant polymers. Coover realized a unique property of cyanoacrylates: they required no heat or pressure to bond, and polymerized easily on contact with the moisture present on the surface of objects. Further tests revealed that objects bonded by the sticky cyanoacrylates stayed bonded permanently. Coover filed a patent for his invention, and in 1958, Eastman Kodak packaged and marketed the product under the name Eastman 910.

Superglue was a commercial success, both for the product and for Dr Coover. He even made a television appearance on the popular 1950s panel game show 'I've Got a Secret'. On the show, Coover gave a live demonstration of his superglue by bonding two metal bars together: one bar was able to support the other through the sheer strength of the adhesive. The show was a success and ensured the popularity of Coover's 'superglue'.
A gun sight is a device attached to a gun, used for aligning and focusing on a distant target.

What's in a name?

Eastman Kodak sold the production rights for cyanoacrylate to Loctite, an American adhesive company that is now German-owned. Loctite sold the product under a new name: Loctite Quick Set 404. Although subsequent manufacturers developed different formulae for producing superglue, the most common formula continues to be the original 910, accidentally stumbled upon by Dr Harry Coover and Fred Joyner.
Superglue was initially sold as Eastman 910 because it was the 910th compound that researcher Fred Joyner tested. He tested it on two lenses, which were stuck together permanently, causing a loss of almost $3,000 worth of laboratory equipment.

Medical adhesive

Coover's dream of contributing to the war effort finally came true in 1966, when a cyanoacrylate spray was used as a medical adhesive, or coagulant, for the temporary treatment of wounded soldiers during the Vietnam War (1955–75). However, due to its minor toxicity, the Food and Drug Administration of the USA did not authorize its use until a safer alternative was introduced.
Polymerization is a process in which small molecules called monomers combine to form large molecules of different shapes and sizes. Molecules built through this process are called polymers.

A life well lived

Dr Coover's career as an inventor did not stop with superglue. Over the course of his long career, Coover contributed to the advancement of science and was awarded more than 460 patents. He also served as a vice president of Eastman Kodak. He received several accolades, was elected to the National Academy of Engineering in 1983 and was inducted into the National Inventors Hall of Fame in 2004. In 2009, at the age of ninety-two, Coover was awarded the National Medal of Technology and Innovation by President Barack Obama, the highest honour that can be conferred on a citizen of the United States of America in the field of technology. Two years later, at the age of ninety-four, Coover died of natural causes.
Inductees to the Hall of Fame include inventors like Alexander Graham Bell, John Boyd Dunlop and Charles Goodyear. Dr Harry Wesley Coover's failed attempt to enhance war weaponry resulted in the invention of a medical adhesive that would save many lives in the years that followed.
Accidents happen. Sometimes, you might end up getting superglue on your skin. What can you do to remove it? Applying nail-polish remover or acetone is one quick way of getting rid of it!
41 Synthetic Dyes
Colours add beauty to our lives, and we often buy things in different colours according to our preferences. Until the discovery of synthetic dye, colours were obtained from natural sources. They were quite expensive to extract, and the colours would bleed when the cloth was washed. The accidental invention of synthetic dye by Sir William Henry Perkin (1838–1907), however, added fast colour to our fabrics.

William Henry Perkin was born in London on 12 March 1838. He was educated at the City of London School, and at fifteen years of age, he enrolled in the Royal College of Chemistry. There, Perkin studied under the renowned August Wilhelm von Hofmann, who was working on quinine, an antimalarial drug, with Perkin as an assistant.

Operation secret garden
It was a moment of pure accident that launched the course of Perkin's destiny. In 1856, he had been working on an experiment to synthesize quinine, which was in great demand as a treatment for malaria. It was around this time that he stumbled upon his greatest discovery: he found that aniline, when extracted (a process of separating compounds) with alcohol, produced a bright purple-coloured substance. Perkin was delighted with the outcome. He asked his friend Arthur Church and his brother Thomas to assist him in finding out more about the substance.
The purple colour Perkin invented was precious because purple was a mark of royalty and aristocracy. Tyrian purple dye was extracted from the glandular mucus of a specific type of mollusc, which made the colour expensive and very hard to come by. Perkin, however, had found a cheaper alternative: purple was no longer an exclusive colour!

Perkin was employed by Hofmann, and so conducting an experiment other than the one assigned to him was a problem. The trio therefore worked secretly in the garden of his Cable Street home, hoping that the invention would have commercial value. The purple substance was named Mauveine. The trio dyed a silk cloth in the solution and found that the colour held fast to the fabric even after washing and exposure to light! In the mid-nineteenth century, England was an industrializing nation and the textile market was booming. Moreover, coal tar, the byproduct of coal that Perkin used as his raw material, was easy to obtain. In August 1856, he filed a patent for his invention. He persuaded his father and brothers to invest in setting up a manufacturing plant for his product, and his business venture was commercially successful. He went on to discover more synthetic dyes.

A trendsetter
Perkin's contribution lies not only in his invention but in the way it changed the fashion of that century. The cost of dyeing clothes was cut down considerably, and the prices of coloured fabrics fell. Middle-class people could now afford colours that had once been worn only by the rich upper classes. His accidental invention thus proved not only a commercial success but also broke down the social barriers that existed in the world of fashion and clothing! William Henry Perkin went on to contribute a great deal to chemistry. In 1866, he became a Fellow of the Royal Society. In 1907, the man who made the world a bit more colourful died of pneumonia. He is buried at Harrow in Middlesex.
He was knighted by Queen Victoria and became Sir William Henry Perkin. He was also awarded the Royal Medal and the Davy Medal for his contribution to science.
Perkin was only eighteen years old when he invented Mauveine!
Tea is a beverage made from the leaves of the tea plant. It is brewed in water and can be drunk plain or with sugar or milk. India is the second largest producer of tea after China and consumes around 70 per cent of its own produce. Accounts of tea being served as a beverage in the region have been recorded from as early as the seventeenth century, and native communities like the Singpho people, inhabiting parts of Arunachal Pradesh and Assam, had tea as part of their tradition even before the British East India Company gained control of the region after the Treaty of Yandaboo.

Tea trails

It is a popular belief that the optimum taste of tea is obtained by brewing the tea leaves in hot water. However, there was the inconvenience of straining the leaves after preparation. Perforated metal containers attached to a chain were used to hold the tea leaves; these were dipped in boiling water and
removed using the chain. Similar models of tea infusers were used until the teabag was accidentally invented at the beginning of the twentieth century. Imported products like tea, sugar, spices, etc., were transported in crates. They were then weighed and sold in loose packets of paper or cloth sewn together. Industrial advancements led to innovations in the packaging of products sold in markets. This was done not only for reasons of storage and preservation but also to make the product more attractive and convenient for use by the consumer.
The Treaty of Yandaboo was signed on 24 February 1826 between the British East India Company and the Kingdom of Ava (Burma). Burma (now Myanmar) ceded the regions of present-day Assam, Manipur, Cachar, the Jaintia Hills, etc., to the British East India Company.
Sullivan's silk
In 1908, Thomas Sullivan, an American tea and coffee merchant, had been sending out samples of tea to his clients. He wrapped the samples in small bags of Chinese silk tied with a drawstring. He learned that some clients dipped the bag directly into hot water, which saved them the cumbersome task of opening the bags and measuring out the loose tea. The idea struck Sullivan as a unique way of selling tea. It was more convenient than the lengthy process of boiling water, adding the tea and then filtering out the tea leaves. Tea could be prepared in a cup! Sullivan immediately used the method to his advantage. He replaced the expensive silk bags with cheaper gauze bags. These small sachets became a favourite among American clients, who liked the neat little sacks of tea as much as the convenience with which tea could now be made.
The German engineer Adolf Rambold invented a machine that could pack teabags automatically, doing away with packing each bag by hand. In 1903, Roberta C. Lawson and Mary McLaren of Wisconsin had already filed a patent for an invention they called a 'tea leaf holder'. It was made of a mesh fabric stitched to hold the leaves while allowing the water to circulate and infuse the tea. However, it was Sullivan who turned the concept of the teabag into a commercial endeavour. The first teabags were shaped more like little pouches tied at the end with a thread. The square paper bags and rectangular teabags commonly used today were invented in the 1940s. The amount and quality of tea has remained more or less consistent, while the shape of the bags has gone through several modifications over the course of time.
Chai pe charcha
Indians love their tea, although teabags are yet to be adopted by the masses. Even in the blistering heat of Indian summers, we seldom refuse a hot cup of chai. Offering tea has become a symbolic gesture of hospitality and a sign of goodwill. Even Prime Minister Narendra Modi used the iconic symbolism of tea in his 'Chai pe Charcha with NaMo' (conversations over tea with NaMo) campaign during the 2014 election. Tea is grown in the state of Assam and in Darjeeling, West Bengal, in the foothills of the Himalayas. The sprawling tea plantations in these regions enhance the landscape and have also become popular tourist destinations.
Assam Standard Time!
An interesting feature of the tea gardens in Assam is that they follow 'Bagan time' or 'Tea Garden Time'. It is an hour ahead of Indian Standard Time because the region experiences early sunrise.
Tea revolution
Tea also plays an important role in world history. No account of tea is complete without the story of the Boston Tea Party, which escalated into the American War of Independence against England. On 16 December 1773, protesters calling themselves the 'Sons of Liberty' boarded ships of the British East India Company. The ships, laden with crates of tea, lay in Boston Harbour. The protesters threw the crates into the sea as an act of protest against the Townshend Acts. Besides sugar, tea was a major source of revenue for the British economy. The protesters of the Boston Tea Party used tea as a symbol of resistance against British colonialism, and 135 years later, an American changed the way the world, including the British, made their tea.
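As a small aside, the one-hour 'Tea Garden Time' offset mentioned above can be sketched with Python's standard timezone support. This is only a minimal illustration; the zone labels here are informal names of my own choosing, not official IANA identifiers.

```python
from datetime import datetime, timedelta, timezone

# Indian Standard Time is UTC+5:30; 'Bagan time' runs one hour ahead of it.
IST = timezone(timedelta(hours=5, minutes=30), "IST")
BAGAN = timezone(timedelta(hours=6, minutes=30), "Tea Garden Time")

midnight_utc = datetime(2019, 1, 1, 0, 0, tzinfo=timezone.utc)
print(midnight_utc.astimezone(IST).strftime("%H:%M"))    # 05:30
print(midnight_utc.astimezone(BAGAN).strftime("%H:%M"))  # 06:30
```

The same instant reads an hour later on a tea-garden clock than on an ordinary Indian one.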
The Townshend Acts were a series of laws that taxed the American colonies separately for goods like tea, paper, glass, etc., that were not produced in the colonies.
Da Hong Pao grown in the Wuyi mountains of China is the most expensive tea in the world. It costs around $1.2 million per kilogram! This clearly is not everyone’s cup of tea!
Teflon is also known as PTFE (polytetrafluoroethylene). It is a thermoplastic fluoropolymer (a polymer containing carbon–fluorine bonds) made from TFE, or tetrafluoroethylene. PTFE is waterproof, slippery, dust-proof and has a high melting point, which makes it ideal for non-stick utensils.
One fine day…
PTFE was invented by Roy J. Plunkett (1910–94) on 6 April 1938, while he was working for DuPont in New Jersey. Plunkett and his assistant Jack Rebok had been attempting to make a CFC (chlorofluorocarbon) to be used as a refrigerant when the TFE gas stopped flowing from its cylinder. This would not have been unusual had the cylinder been empty. However, its weight showed that it was far from empty, so the gas had to be inside. This intrigued Plunkett so much that he decided to cut the cylinder open to see what was happening. More surprises awaited him: the inside wall was coated with a white, waxy substance that was very slippery to the touch. He immediately examined the substance and found it to be polymerized tetrafluoroethylene.
DuPont registered the Teflon trademark in 1945 under Kinetic Chemicals, and by 1948 it was producing more than 2 million pounds of PTFE a year.
In a non-sticky situation
The most enduring use of PTFE is in the kitchen, where pots and pans are coated with Teflon to prevent food from sticking to the surface during cooking. The world owes credit to Colette Grégoire and her husband Marc Grégoire for the innovative idea of Teflon-coated utensils. In 1954, Colette urged her husband to use the material on their aluminium cooking pans. Marc Grégoire complied, and the first PTFE-coated pans were sold under the brand name 'Tefal', combining 'Tef' from Teflon and 'al' from aluminium, the metal from which most cookware was made. PTFE-coated pans in the United States, sold as the 'Happy Pan', were marketed with the motto: 'nothing sticks to Happy Pan'.
Who was Roy J. Plunkett?
Roy Joseph Plunkett was born on 26 June 1910 in New Carlisle, Ohio, and received his PhD from Ohio State University in 1936. For his invention, Plunkett was awarded the John Scott Medal by the city of Philadelphia in 1951, and in 1985, he was inducted into the National Inventors Hall of Fame. He died of cancer at the age of eighty-three on 12 May 1994 at his home in Texas.
Cleaning tip: Always remember to use scratch-proof scouring pads for cleaning non-stick or Teflon-coated pans.
44 The Wheel
The wheels on the bus go round and round all day long!
The popular nursery rhyme gets the laws of motion right. The wheel will continue in its set motion unless an external force is applied to stop it. The wheel is a circular block made of durable material, with a hole at the centre where the axle bearing is placed. The wheel rotates about the axle, and movement is achieved. Imagine if the wheel had not been invented: many of the day-to-day objects that ease our work would not exist. The wheel has not always been there; humankind spent many millennia without it, even after the invention of agriculture.
Ancient wheel
The wheel is believed to have been invented during the late Neolithic age, or the New Stone Age. The oldest surviving evidence of a wheel, dating back to around 3500 BC, was found in the city of Ur, Mesopotamia (modern-day Iraq). Clay-tablet pictographs were also found there, dating back to 3699 BC. Many other such archaeological findings are evidence that our forefathers invented and made use of the wheel. Of course, in material and form these early wheels were rudimentary compared to the kinds in use in modern times.
The word 'wheel' is derived from the Old English 'hweol' or 'hweogol', meaning 'to revolve' or 'move round'.
Potter's wheel
The wheel excavated in Ur indicates that it was meant to be used in pottery. There is no precise record of when the potter's wheel was modified into the wheel and axle. However, the concept of round objects moving when rolled was already in existence; a round boulder, for instance, would roll down a hill. The invention of the wheel would not have been possible without knowledge of the rotational motion of the potter's wheel. It was a very important moment in the history of technological invention when our ancestors looked at the round potter's wheel and realized that they could use it as a transportation device! It set the wheel in motion for the technological advancements that we have achieved today. Once invented, the wheel was adapted for various tasks, such as the water wheel, the spinning wheel and so on. Since there is no clear account of how the wheel came to exist, we must depend on the evidence found through archaeological excavation. However, it is safe to assume that scientific thinking and innovation in those periods would have been extremely limited, if it existed at all. So, any discovery or invention would have been a matter of accident or luck.
The oldest wooden wheel was excavated in Ljubljana, Slovenia, and dates to around 3200 BC.
The loophole
In the modern era, if credit for the invention were to be given to an individual, John Keogh of Australia would claim the honour. However, he did not actually invent the wheel but was granted a patent for a 'circular transportation facilitation device'. Keogh wanted to expose a loophole in Australia's new online patent system, introduced in 2001, which did not require a patent lawyer. To prove his point, he filed an application and was granted the patent!
The wagon-wheel effect
When a movie camera captures a spinning wheel, it records the continuous motion as a series of discrete frames. In one frame, a spoke might be at the 12 o'clock position, and in the next frame the camera might capture it at 10 o'clock. This creates the illusion that the wheel is moving backwards, and the effect is known as the wagon-wheel effect.
Roulette wheel
Another accidental invention that resulted from the wheel was the roulette wheel, created by Blaise Pascal, a seventeenth-century French mathematician. Pascal was trying to build a perpetual motion machine, which would keep moving once started. Instead, he ended up with a wheel that came to be known as 'roulette', or 'little wheel', an early prototype of the roulette wheel used in casinos.
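The frame-by-frame aliasing behind the wagon-wheel effect can be sketched in a few lines of Python. This is an illustrative toy, not something from the book, and the function name is my own:

```python
def apparent_step(rotation_per_frame_deg):
    """Rotation the eye infers between two frames, folded into (-180, 180].

    A camera only records the wheel's position once per frame, so a large
    forward rotation is indistinguishable from a small backward one.
    """
    step = rotation_per_frame_deg % 360.0
    if step > 180.0:
        step -= 360.0  # the brain picks the smaller, backward-looking motion
    return step

print(apparent_step(20.0))   # 20.0  -> looks like normal forward motion
print(apparent_step(350.0))  # -10.0 -> the wheel seems to spin backwards
```

A wheel turning 350 degrees between frames is indistinguishable from one turning 10 degrees backwards, which is exactly the backward-spinning illusion the passage describes.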
In the Middle Ages, the wheel became an instrument of punishment. A person would be tied to the face of a wheel, and the wheel would be set in motion, crushing the person. Legend has it that Saint Catherine of Alexandria was condemned to die on such a wheel. However, the wheel broke, which was taken as a sign from God, and she came to be venerated as the patron saint of wheelwrights; the spiked 'Catherine wheel' is named after her. The wheel came relatively late in the list of man's early inventions and is predated by inventions like sewing needles, boats and basket weaving. However, once it came into existence, it brought about the possibility of improving most of the inventions that came before and after it.
Pencils, sharpeners, door-knobs, clocks, cars, etc., are some of the ordinary day-to-day things that have a wheel in them. Humans are the only species that uses this device!
45 Thermal Inkjet Printer
It has become so easy to get copies of notes, books, pictures, computer data, etc. Gone are the days when students would copy notes out by hand. Nowadays, one can simply scan and print any number of pages, saving both time and effort. The history of modern printing began with Johannes Gutenberg, while computer printing is a twentieth-century phenomenon. It began in 1938, when Chester Carlson developed a dry printing process known as electrophotography. Electrophotography is also used in photocopiers, fax machines, etc.
Get, set, print!
An inkjet printer is a device that recreates digital images by spraying minute jets of ink onto paper, plastic or other substrates. Inkjet printing is a twentieth-century concept, with the technology extensively explored and developed in the 1950s. The first commercial inkjet printer was
invented in 1976. However, it wasn't until 1988 that inkjet printing became widely available commercially, when Hewlett-Packard (HP) launched its DeskJet printer. Inkjet printers are mainly produced by leading brands like Epson, HP, Canon and Brother. Currently, they are the most common and convenient computer printing devices, used extensively in offices, schools and private homes.
Printhead
The printhead is the heart of the inkjet printer. It determines the speed and quality of the image that the printer produces. Printheads are primarily of three types: piezoelectric, continuous inkjet and thermal inkjet. The piezoelectric printhead was developed from the principle of piezoelectricity (go to Chapter 8). The piezoelectric material changes shape when voltage is applied, squeezing the ink and forcing droplets out of the print nozzle. This type of printhead ejects ink only when needed and is known as drop-on-demand (DoD). The earliest form of the continuous inkjet (CIJ) was invented by Lord Kelvin in 1867. He built a recorder that could capture telegraphic signals and imprint them as a continuous trace on paper using a jet of ink. However, it was not until 1951 that a commercial device was developed. Thermal inkjet printers have a series of minute chambers, constructed using photolithography, which are electrically heated. An electric pulse passed through a chamber causes the ink inside to form a vapour bubble, and the expanding bubble propels ink onto the paper in a jet-like spurt. This principle was accidentally discovered by Ichiro Endo in 1977.
Accidental Endo
Ichiro Endo was a Canon engineer who stumbled upon the idea of the thermal printhead in 1977. In a moment of distraction, Endo unknowingly rested a hot iron on his pen! The heat caused the ink to escape from the nozzle of the pen in a quick, jet-like spurt.
This caught Endo's interest, and he conducted subsequent experiments to prove that his thermal principle could work in a printhead. It turned out to be a success, and he filed a patent for his invention. Most modern printers from HP, Canon, Lexmark, etc., use printheads based on Endo's thermal principle. It pays to be a little distracted sometimes!
Ichiro Endo was granted the patent on 19 May 1998. It served as the model for modern inkjet printers.
Next time you go to the local photocopier, you could ask for more details about the workings of the printer!
Tofu is a food prepared by coagulating (changing to a solid or semi-solid state) soy milk. It is also called 'bean curd' because of its main ingredient, soybeans. Tofu is made by first coagulating the soy milk and then pressing the resulting curd until it forms a solid block. It is a very important part of the east Asian and southeast Asian diet.
Tofu in the east
The first recorded instance of tofu consumption can be traced back more than 2,000 years, to the Han dynasty in China. The tofu block can be extra firm, soft or silken. It is added to other dishes because it absorbs the flavours of the other ingredients; its taste changes with the components it is prepared with. It is said that Prince Liu An fed soy milk to his ailing mother, who loved soybeans but was too old and weak to chew them. Liu An was said to have been living with vegetarian monks, who taught him the process of curdling and extracting tofu.
'Tofu' comes to English via the Japanese 'tofu', from the Chinese 'doufu', meaning curdled or fermented beans.
Tofu in the west
In the west, the first mention of tofu is found in a letter written by the English merchant James Flint to Benjamin Franklin in 1770, where he mentions a food he called 'towfu'. According to Chinese legend, the food was invented by Prince Liu An. This is the most common of the three theories of the origin of this famous food.
Tofu by accident
One version of the accidental invention theory suggests that Liu An, a prince of the Han dynasty, wanted to learn to cultivate dan. Dan is a term for the energy clusters of the body, and Liu An wanted to preserve his. He used spring water to make soy milk so that he could hold on to his dan. However, the soy milk was accidentally mixed with gypsum powder and salt, and it set into the solid chunk of tofu that we know today. Many people enjoyed the delicious bean curd, and it gained popularity in the households of the kingdom. Another theory suggests that tofu was invented accidentally when boiled milk was mixed with soybean powder and sea salt. Sea salt is rich in calcium and magnesium and would have curdled the mixture into a crude form of the tofu that exists today. Still another theory suggests that the ancient Chinese learned the trick of curdling milk from the Mongolians and the East Indians, as no earlier knowledge of curdling existed in Chinese culture. In China, tofu is used as an offering when visiting the graves of relatives. The belief is that the spirits of the deceased have lost their chins and jaws, and so can eat only foods that are soft and require no chewing.
Tofu in Chinese culture
In Chinese culture, tofu has a lot of significance apart from consumption. It is considered a cooling agent and is used in traditional Chinese medicine. For the same reason, it is believed to help detoxify the body and maintain 'chi', the body's energy equilibrium. Although there is no scientific proof of these properties, the Chinese continue their tradition of making tofu to maintain chi to this day.
In 1941, Henry Ford unveiled the ‘Soybean Car’, where he used a kind of plastic derived from soybeans, wheat and corn to make its body!
47 Smallpox Vaccine
Do you remember being vaccinated? What purpose does it serve to improve
our health? Let's find out. Vaccination is the process of administering a harmless dose of microorganisms into the body so that its immune system can develop antibodies to protect the body from further attack. It is a precautionary procedure aimed at safeguarding the body against possible future threats. Polio, smallpox and tetanus shots are examples of vaccination.
Pox on you!
Smallpox was a dangerous disease caused by the viruses Variola major and Variola minor. The disease was most common among infants and small children. The symptoms included fever, sores, blisters and scabs, which often led to permanent scarring. Although its origin is unknown, the earliest
evidence of the disease was found in Egyptian mummies. In the eighteenth century, an estimated 400,000 people were believed to have died of the virus every year in Europe, and approximately 500 million people perished in the last 100 years of the disease's existence.
Prominent monarchs from history who died of smallpox include Queen Mary II of England, Emperor Joseph I of Austria, King Luis I of Spain, Tsar Peter II of Russia, Queen Ulrika Eleonora of Sweden and King Louis XV of France. The death toll from smallpox was nothing short of a slow and gradual plague, claiming millions of lives in its wake. In 1967 alone, there were 15 million cases of smallpox reported across the world. Such a deadly virus would have continued to claim millions of lives had it not been for a divine stroke of luck and the efforts of a man named Edward Jenner. Edward Jenner (1749–1823) was an English physician and scientist. He is credited with saving the lives of millions of people through his invention of the smallpox vaccine in 1798. Born in Gloucestershire, England, Jenner was a first-hand witness to the death toll of smallpox, and it shaped a strong interest in medicine and science in his young mind. By 1770, he had become a pupil of the surgeon John Hunter at St. George's Hospital. In his childhood, Jenner had been inoculated against smallpox, a procedure in which an individual is deliberately exposed to the virus in a small dose and develops mild symptoms. The symptoms subside over the course of a few weeks, and consequently, the individual develops immunity against the virus. An early instance of immunity against smallpox was observed in the wife and children of Benjamin Jesty, who inoculated them with cowpox during the 1774 smallpox epidemic. It is uncertain whether Jenner had heard of these instances, but one day he happened to observe that milkmaids who were in constant contact with cows exhibited small blisters,
which were filled with pus. Jenner hypothesized that it was perhaps the cowpox virus that made them immune to smallpox.
The experiment
In 1796, Jenner tested his hypothesis by scraping pus from a milkmaid infected with cowpox and rubbing it into the arm of James Phipps, his gardener's young son. This process was known as inoculation. The boy developed a fever, which subsided, and upon later exposure to the smallpox virus, Jenner found that the boy was immune. In order to be absolutely certain, Jenner conducted the same test on an additional twenty-three subjects, including his own son.
Prognosis
Once convinced that his findings were not flawed, Jenner wrote them up and reported them to the Royal Society. After much deliberation and delay, his findings were finally accepted, and by 1840, the British government had taken measures to provide free and accessible cowpox vaccination to the public under the Vaccination Act of 1840.
Later life and legacy
Jenner continued working to improve his vaccination and was granted a large amount of money for his research. He was appointed physician extraordinary to King George IV in 1821. During his lifetime, he was also made a foreign honorary member of the American Academy of Arts and Sciences. On 25 January 1823, he suffered a stroke and died the next day.
Winning over the Emperor
It was said that Napoleon, the Emperor of France, made sure all his troops were vaccinated, and he had the highest regard for Jenner. It was also reported that Napoleon released two Englishmen during the Napoleonic Wars at Jenner's request, as he could not possibly 'refuse anything to one of the greatest benefactors of mankind'. Jenner made a tremendous contribution to the advancement of medical science, and thanks to his invention, the World Health Organisation (WHO) declared smallpox eradicated in 1979. Edward Jenner's work paved the way for immunology, and he is deservedly known as the 'Father of Immunology'.
Ebola, swine flu, HIV, etc., are examples of other life-threatening viral diseases.
Skincare products are a necessity in winter. They nourish and protect our skin from dryness. Essential beauty products come in a variety of brands, and among these, Vaseline is a household name that has been associated with beauty and skincare for more than a century. Vaseline is a byproduct of petroleum, formed during the process of distillation (the process of purifying a liquid by heating and cooling). It has a thick texture, with an oily, paste-like quality when applied to the skin. The petroleum jelly product that came to be popularly known as 'Vaseline' was invented by Robert Augustus Chesebrough in 1870. In 1872, he patented the process for extracting the useful jelly-like substance from petroleum.
Rod wax
Robert Chesebrough was a London-born American chemist who worked in a distillery that extracted kerosene from sperm-whale oil. However, the discovery of petroleum caused a decline in the market for sperm-whale extracts, and in 1859, Chesebrough had to look for new avenues of making a living. This led him to the oil fields of Titusville, Pennsylvania, to explore the potential of the oil industry. Through his interactions with workers at the oil rigs, he came to learn of a substance called 'rod wax'. This was a gel-like substance that would build up in the rig machinery, causing it to stop functioning, and had to be removed periodically. Nonetheless, rod wax had its uses too: the rig workers applied the jelly to cuts and burns as a topical ointment. Chesebrough was intrigued by the substance and decided to take a sample home for further study.
The name 'Vaseline' is said to combine the German word 'Wasser', meaning 'water', and the Greek word 'elaion', meaning 'olive oil'.
The trial
Chesebrough worked on the substance for the next decade, drawing on his experience as a chemist. He would experiment on himself, applying the jelly to self-inflicted wounds and tracking the healing process. He finally succeeded in refining it into a clear, jelly-like substance, which he named Vaseline. In 1870, Chesebrough opened his first factory in Brooklyn, New York, and began marketing his 'miracle' from a horse-drawn cart, demonstrating the healing properties of Vaseline. As he steadily gained success in the market, he filed for a patent in 1872 at the United States Patent Office, describing at length the product and his procedure for making it.
Vaseline
Vaseline experienced a staggering increase in sales, and by 1874, around 1,400 jars of the jelly were being sold every day. This was only the beginning of the success story, as the next decade saw the iconic rise of Vaseline from 'rod wax' to an essential item in American homes. With popularity also came the threat of imitation, as products resembling Chesebrough's Vaseline appeared in stores. In 1875, Chesebrough founded the Chesebrough Manufacturing Company, and to prevent imitations of his product, he introduced the Blue Seal to mark his bottles and jars. This seal continues to be a mark of product authenticity to this day. In 1883, Queen Victoria conferred a knighthood on Chesebrough; it was said that Vaseline was personally used by the queen. Nearly four decades after inventing the jelly, Sir Chesebrough retired from his position as president of the company in 1908.
Healing balm
As the twentieth century dawned, it brought many achievements as well as destruction for mankind, including the First and Second World Wars. In some of these historical events, Sir Chesebrough's product played an integral part. On 6 April 1909, the American explorer and United States Navy officer Robert E. Peary Sr. led the first successful expedition to the North Pole. He reportedly carried Vaseline on his expedition to protect his skin from ice burns, because Vaseline does not freeze in cold temperatures. During the First World War, soldiers of the United States carried tubes of the healing jelly to treat cuts and wounds, to prevent sunburn, and for barter with British soldiers. On 8 September 1933, at the age of ninety-six, Sir Robert A. Chesebrough died at his home in Spring Lake, New Jersey. However, his legacy was far from over. A decade later, Vaseline made headlines in The New York Times, which printed the story of how sterile Vaseline commissioned by the Surgeon General of the United States helped save the lives of wounded US soldiers.
In 1955, the Chesebrough Manufacturing Company merged with Pond's Extract Company, an American beauty-product company founded by the pharmacist Theron T. Pond in 1849, to become Chesebrough-Pond's. In 1987, the company was acquired by Unilever, an Anglo-Dutch company, which went on to launch personal care products under the Vaseline name.
Magic of the jelly
Vaseline has not only healed people but also inspired other inventions. Thomas Lyle Williams was said to have seen his sister, Mabel, use a coat of Vaseline mixed with coal dust, which created an effect similar to that of modern mascara. On developing the product, he founded Maybelline, a name combining 'Mabel' and 'Vaseline'. The company went on to become a leading brand in cosmetic products and remains so to this day.
The American rock band The Flaming Lips celebrated the jelly in their 1993 song 'She Don't Use Jelly'. The British pop group Elastica released their song 'Vaseline' in 1995. The most popular song remains 'Vasoline', released in 1994 by the American band Stone Temple Pilots. The inspiration does not end there. Sir Chesebrough's Vaseline ended up inspiring not only beauty products but popular culture as well: the 1990s saw a resurgence of popularity as Vaseline was woven into several song lyrics. Matthew Barney, an American artist, used petroleum jelly in his art. Boxers and fighters are allowed to apply Vaseline to their faces before a fight to make it harder for their opponents to land a punch. The 'magic' jelly of Sir Chesebrough has indeed worked its magic, introducing the innovative idea of beauty and healthcare working hand in hand for the benefit of consumers. The company he founded from the back of his horse and cart continues to thrive more than eighty years after his death and 149 years after its establishment. The dedication with which he served his company has given it the steady foundation required to move ahead into the twenty-first century.
In 2009, a fight between Georges St-Pierre and BJ Penn in the Ultimate Fighting Championship was rocked by controversy when Penn claimed that St-Pierre had rubbed Vaseline on his back and shoulders to make it difficult for his opponent to grapple him.
Visit your local shops to find out how many Vaseline products are being sold. You will see how Chesebrough's legacy is still alive.
Velcro is a strap that uses tiny hooks and loops to bind two strips together.
It is the 'zipperless zipper', used in many products like children's shoes, boxing gloves, laptop bags, etc. To the curious and inquisitive mind, a simple walk along the mountainside holds the potential for new discoveries and inventions. George de Mestral (1907–90) and the story of how he invented the Velcro strap is one such example. Let's get on with the story, shall we? George de Mestral was born in Colombier, Switzerland, on 19 June 1907. He was an engineer by profession and loved hunting. In 1941, he was returning home from a hunt with his dog in the Alps.
Mestral found his trousers and his dog's fur covered with burdock burs (the prickly seed heads of a kind of herb), which stuck fast and were very difficult to remove. This aroused his curiosity, and upon examining them under a microscope, he found that the burs had tiny hooks, which caught easily onto anything with a loop, like his clothing or his dog's fur. Mestral was quick to see the possibility that had presented itself and began thinking of ways to replicate this natural adhesive. He first tried to replicate the hooks on cotton strips, but the fabric was too soft and wore out easily. Looking for an alternative, he decided to use the strongest synthetic fibre he knew: nylon. Fortunately, nylon proved a suitable choice, as he discovered that sewing nylon under hot infra-red light formed the perfect hook he needed. The next challenge was to produce the loop side. Mestral came up with the method of weaving nylon loops and heating them, as the nylon retained its shape when heated.
Perfecting the invention
It took Mestral eight long years to mechanize the weaving process and one more year to build the loom for trimming the woven loops. The whole process of inventing his product took ten years, after which Mestral filed a patent in Switzerland in 1951; it was granted in 1955. Mestral named the product 'Velcro'. The success of Velcro made it possible for him to open companies in various countries. Although the concept was a success, it took longer for the textile industry to accept the easy-to-remove straps. However, the aerospace industry made good use of the hook-and-loop concept, as it helped astronauts put on and remove their spacesuits. Mestral tried to update the patent for his invention, but it expired in 1978, leading to an influx of cheap, inferior copies flooding the market from China, Taiwan and South Korea.
The hook-and-loop invention of de Mestral is an example of bionics, in which humans draw direct inspiration from nature or try to copy it. Velcro has its share of advantages and disadvantages. It makes a tearing sound when opened, which can act as a deterrent against pickpocketing, and it relieved us of the tedious task of tying shoelaces once shoes with Velcro straps were invented. However, it tends to accumulate dust, hair, fur and the like in its hooks, which are very difficult to remove, giving Velcro straps an untidy, unkempt appearance. It can also get stuck on other fabrics.
The name ‘Velcro’ is a portmanteau of the French words ‘velours’, meaning ‘velvet’, and ‘crochet’, meaning ‘hook’.
50. Vulcanized Rubber
Vulcanization is the process by which natural rubber is treated with chemicals to produce a harder and more durable material. Natural rubber that has undergone this process is called vulcanized rubber.

The story of Charles Goodyear (1800–60) and his invention is one of perseverance in the face of misfortune. Born on 29 December 1800 at New Haven, Connecticut, Goodyear was a self-trained manufacturing engineer and chemist. In 1814, he went to Philadelphia to learn the hardware trade and returned at twenty-one to work in partnership with his father. They opened a business manufacturing agricultural implements and metal buttons. The business was doing well until 1829, when his health deteriorated and the business collapsed.

When life gives you lemons, experiment!
In 1831, Goodyear first came to know about natural rubber, also known as ‘gum’. After meeting the manager of the Roxbury Rubber Company in Boston, he began experimenting with gum to improve the performance of rubber products. Upon his return to Philadelphia, he was arrested over his unpaid debts, and he used his time in prison to test the properties of rubber. His first subject was India rubber, which was more affordable than the other rubbers on the market. Goodyear heated the rubber and worked in a powder of magnesia (magnesium oxide, MgO), which seemed to remove its stickiness and produced a white compound. He began making shoes from the compound. However, it was not long before he realized that the shoes were impractical, because the gum regained its stickiness. Not many inventors would have taken the risks he took or sacrificed as much as he did. Goodyear sold all his furniture and moved his family into a boarding house. He then went to New York to try his fortune, continuing his experiments through several failed attempts. With the help of an old friend, he took up cloth-making and manufactured rubber shoes, life-preservers and a variety of other rubber goods. Things were finally looking up: he moved his family to Staten Island and owned a home once more. Yet again he was struck by misfortune, as a financial crisis of the time sank his entire fortune. Unwilling to let misfortune get the best of him, Charles Goodyear set out once more for Boston and, through the goodwill of friends and acquaintances, continued his experiments, trying to find a way to make rubber withstand both hot and cold temperatures.

The silver lining

In 1839, it is said, Goodyear had been working with rubber and sulphur when he accidentally dropped the mixture into a flame. He observed that the rubber, instead of melting, grew stronger.
When he least expected it, the answer that Goodyear had been looking for seemed to fall right into his lap.
Vulcanization is named after Vulcan, the Roman god of fire.

Testing the rubber further, he saw that it remained stable in both hot and cold temperatures. He continued refining the process and, five years after the accidental invention, finally filed a patent for the process he had stumbled upon, naming it ‘vulcanization’. Goodyear also wanted to break into the British market and gain a British patent for his process. He sent out samples of his product, one of which fell into the hands of Thomas Hancock in 1842. Hancock then filed a patent of his own; the two became entangled in a lawsuit, which Goodyear lost, and Hancock was granted the patent. Despite such an important invention, Charles Goodyear did not have an easy life, and at the time of his death he was still burdened with debts. On 1 July 1860, he was on his way to see his dying daughter in New York when he was told that she had already passed away. He collapsed and died at the age of fifty-nine.

Legacy

In 1869, his son, Charles Goodyear Jr., carried the family name forward when he invented the ‘Goodyear welt’, a strip of rubber, plastic or leather that runs along the edge of a shoe’s sole. Goodyear may have died in debt, but the royalties from his invention provided financial security to his family. The Goodyear Tire and Rubber Company, founded in 1898 by Frank Seiberling, was named in his honour. On 8 February 1976, he was inducted into the National Inventors Hall of Fame.
‘Ficus elastica’ is the scientific name for the rubber plant. The rubber is produced from the sap of the tree.
Have you ever come across the saying, ‘Beauty is only skin-deep’? X-rays have proven the expression true indeed. If you go for an X-ray scan, the result is a shadowy image of the inside of your body, highlighting the bones. This is done to detect problems inside the body, such as broken bones.

What are X-rays?

X-rays are electromagnetic radiation, part of the electromagnetic spectrum that also includes visible light, microwaves, radio waves and UV rays. Their wavelengths, ranging from 0.01 to 10 nanometres, are shorter than those of visible light, and the rays pass through objects to different degrees depending on the objects’ density.
These radiations are also known as Röntgen radiation, named after the scientist who discovered them.

The non-fictional man with X-ray vision

Long before the fictional superhero Superman and his X-ray vision became popular, there was a man named Wilhelm Conrad Röntgen. He was born at Lennep, Germany, in 1845. Three years later, the Röntgens moved to Apeldoorn in the Netherlands, where young Wilhelm began his education at a boarding school. Although not an outstanding pupil, Röntgen loved nature and was very adept at building things, a passion that remained with him throughout his life. At the age of seventeen, he enrolled at a technical school in Utrecht but was expelled after being accused of an offence he did not commit. In 1865, he enrolled at the Federal Polytechnic Institute (ETH Zurich) in Zurich to study mechanical engineering. He became a favourite of Professor August Kundt and of Rudolf Clausius, both renowned German physicists teaching in Zurich. Röntgen went on to earn his PhD from the University of Zurich in 1869 and was appointed assistant to Kundt; the pair later moved to Würzburg. Röntgen’s career advanced rapidly through positions at several esteemed universities, until in 1900 he moved to the University of Munich at the special request of the government. He later made plans to emigrate to the United States of America and had even accepted an appointment at Columbia University in New York. However, the First World War broke out in Europe before he could undertake the journey, and he had to cancel his plans and stay in Munich.

Ray of hope

Röntgen had been studying the effects of electric current in cathode ray tubes. At the time, electricity was a fairly recent phenomenon, and there was much to be explored. In 1895, while working with a Lenard tube, Röntgen used a small window of thin aluminium sheet at one end of the tube to allow the cathode rays to exit. The aluminium was covered with cardboard to protect it from damage; the cardboard covering also prevented light from escaping the tube.
A cathode ray tube is the tube-shaped part in an old television or computer screen. Inside it, a continuous stream of electrons is produced to create the images or text.

Röntgen was surprised to see a faint tinge of fluorescence on a screen placed at a distance, facing the aluminium window of the tube! The screen was coated with barium platinocyanide, an inorganic chemical compound. He replaced the Lenard tube with a Crookes-Hittorf tube, which had much thicker glass, to see whether it would produce the same effect. On 8 November 1895, Röntgen connected the Crookes-Hittorf tube to a Ruhmkorff coil, an induction coil used to produce high voltages. He covered the tube and switched off the lights to make sure no light was escaping from it, and noticed that something on a bench in the corner of the laboratory had started to give off a fluorescent shimmer. It was the barium platinocyanide screen he had intended to use for the experiment, which had been placed on the bench. Röntgen deduced that whatever was causing the screen to glow must be a new and unknown kind of ray, and he named it the ‘X-ray’, since ‘x’ in mathematics stands for an unknown quantity. Röntgen continued experimenting with X-rays, testing other materials to see if any of them could stop the rays. After several attempts, he tried a sheet of lead, and as he held it in the path of the rays, he was astonished to see the skeletal shape of his own hand, with all its bones visible, on the coated screen.
Röntgen had been working with vacuum tubes of the kind used by scientists such as Nikola Tesla, William Crookes, Johann Hittorf and Philipp von Lenard.

This was the first X-ray image! Since there had never been any reference to such an image of human bones, Röntgen could not yet be sure of what he had found. He carried on his experiments in secret until he could obtain substantial proof of his discovery.

Anna’s hands

After a few weeks of secretive experiments, Röntgen asked his wife, Anna Bertha, to be part of his experiment. He had her place her hand between the tube and the coated screen and took an X-ray image. She was horrified by what she saw: the ghostly outline of the bones of her fingers, along with the clear shadow of the wedding band she wore on her finger. Röntgen also took an X-ray image of the hand of his friend Albert von Kölliker and displayed it when he presented his findings. Now confident that this was a groundbreaking discovery, he published three papers on X-rays. X-rays proved indispensable to diagnostic medicine and radiology, as the rays pass through human tissue and reveal broken bones, bone tumours or anomalies in bone structure. Using X-rays, surgeons could locate the precise site of a fracture or broken bone and mend it.

Röntgenium

What followed was a series of awards in recognition of this great discovery. In 1901, Röntgen became the first recipient of the Nobel Prize in Physics. In 2004, element number 111 was named ‘roentgenium’ in his honour. Although his discovery of X-rays was accidental, he was not the first to notice that certain rays make some objects appear fluorescent; he was, however, the first to go down the rabbit hole. Dr Wilhelm Conrad Röntgen died of intestinal cancer at his home in Weilheim, near Munich, on 10 February 1923. 8 November, the day on which he made his discovery, is celebrated as World Radiology Day.
We are asked to remove all jewellery before an X-ray because the radiation cannot penetrate metal, which would block a clear picture of the anatomy.
Conclusion

Did you enjoy learning new things about the inventions that you are already familiar with? These are only fifty-one of the many inventions made by man. Isn’t it amazing how much we can accomplish with a little luck and a lot of hard work? Someday, someone might write about something you invented too! All you have to do is keep an open mind and learn all you can. After all, knowledge never hurt anyone. The inventions we have just read about, for instance that of Percy Spencer, were not born from pure luck. No! Spencer painstakingly acquired the skills he would need to recognize a golden opportunity when it struck. He was ready to be surprised! We must do the same. We have discovered gravity, we have learned to fly using machines we have built, we have conquered the seas and hoisted a flag on the moon! Do we need more proof that anything is possible? Let us remember the words of one of the greatest scientists of our time, Albert Einstein: ‘There are only two ways to live your life. One is as though nothing is a miracle. The other is as though everything is a miracle.’ So, are you ready for yours?
Acknowledgements

I would like to express my sincerest gratitude to the following people.

The dynamic team at Rupa, without whom this book would never have seen the light of day, much like the many inventions which, I suspect, were lost to humanity because the lucky people who stumbled upon the ideas never explored them further.

Siam, for providing such beautiful illustrations.

Lxl Ngaihte, for saving me at the eleventh hour.

Sam, my husband, who kept the coffee-watch as I blasted away on my laptop. He made sure I stayed awake to write.

My cats, who provide the much-needed distraction in moments of self-doubt.