Morphing Drones, Automatic Fabrics, Robot Birds, With Ralph Bond, Science/Tech Trends!

Both Segments: Ralph Bond, Science/Tech Trends Correspondent.

Today I’m available to provide spot project support in PR, podcast production, and general marketing communications.  My 10 years of service to Autodesk introduced me to the world of Architecture, Engineering and Construction [AEC] technology.  I’m always open to project work in that field. 

March 2019 show notes

Story 1:  Scientists invented a fabric that knows whether to cool you down or warm you up

Source:  BGR Story by Mike Wehner

Link: https://bgr.com/2019/02/10/heat-regulating-fabric-invention-research/


Image Source: Faye Levine, University of Maryland

Dressing in layers is usually the easiest way to ensure that you won’t be uncomfortable at any point in the day. If things get too hot, just shed a layer and you’re good, and if you get chilly again, just slip it back on. But what if you didn’t have to do that at all? What if your clothing could tell if you were too hot or too cool and adjust accordingly?

That’s exactly what researchers from the University of Maryland seem to have accomplished with a remarkable new kind of fabric that actually changes its behavior depending on your body temperature.

The invention, which is described in a new research paper published in Science, actively manages the amount of infrared radiation (that’s just heat) that passes through it. The fabric is capable of this remarkable feat thanks to metal-coated strands that react to the amount of heat passing by them.

The strands are made up of two different types of material, neither of which you’ll find in nature. One of the thread materials absorbs moisture while the other repels it. This might sound like a bad idea, but it actually gives the fabric some very unique properties.

When the fabric is close to skin that is sweating, the dual-action threads begin to warp from the moisture. As the strands change shape, they allow more heat to pass through the layer of fabric while also altering the properties of the coating itself, promoting the escape of infrared radiation and rapidly cooling the skin.

The opposite happens when the skin cools: moisture evaporates from the skin and threads, and the fabric returns to its original, warmer configuration.

“The human body is a perfect radiator. It gives off heat quickly,” co-author Min Ouyang said in a statement. “For all of history, the only way to regulate the radiator has been to take clothes off or put clothes on. But this fabric is a true bidirectional regulator.”

It will still be a while before such fabric makes its way into consumer items, but the research shows a lot of promise, and it might not be too long before our clothing can monitor our comfort without us doing a thing.

Story 2:  Self-folding drone could speed up search and rescue missions

Source:  CNBC Story by Chloe Taylor

Link: https://cnb.cx/2IJdEae

https://fm.cnbc.com/applications/cnbc.com/resources/img/editorial/2019/02/18/105745626-1550488572478gap_traverse.1910x1000.png?v=1550488603

Photo: University of Zurich

See Super Nifty Video here: https://www.youtube.com/watch?v=jmKXCdEbF_E

A drone that can change shape in flight has been developed by researchers at the University of Zurich to assist with search and rescue missions.

Able to contract and fold, the aircraft can enter small cracks and spaces to stream footage to rescue teams via its two integrated cameras. The drone is designed for use in areas in disaster zones that become inaccessible to rescuers due to safety concerns or physical restrictions.

Davide Falanga, one of its developers, told CNBC on the phone that the drone could make rescue missions more efficient and effective.

“(This drone could have) multiple impacts – it can go into areas that would otherwise be inaccessible,” he said. “In the aftermath of an earthquake it could let rescuers enter and explore a collapsed building. We used the most efficient and stable systems to allow it to fly longer, and have held public demonstrations in realistic scenarios which showed that this is a feasible product.”

Funded by the Swiss National Science Foundation, the project took six months to go from concept to prototype. However, as the drone is still in early development stages, its developers have no timescale for a wide rollout.

“We would be open to commercializing and discussing opportunities with investors, but at the moment there’s no commercial plan,” Falanga told CNBC. “We have sometimes had to tinker with the system when we’ve deployed it — we want it to be deployed and work immediately, so we need about six months to a year to improve the drone and make it more robust so it can work in more scenarios. But the idea itself is definitely feasible.”

According to the research paper written by the drone’s developers, their aircraft “could lead to a shift in the research community towards morphing aerial vehicles.”

However, they noted that there were still several unsolved research questions, such as “automatic morphology selection,” which refers to the robot’s ability to autonomously take the best shape for the task at hand.

Unsolved problems

Mohan Sridharan, senior lecturer at the University of Birmingham’s School of Computer Science, told CNBC via email that foldable drones were being explored by the wider robotics research community, with several concepts currently in development.

“This would indeed help in disaster response, but the stable navigation of such a drone is not a solved problem,” he said. “Also, complex applications such as disaster response pose other challenges related to perception, reasoning, and communication.”

Maria Kamargianni, lecturer in transport and energy at University College London, told CNBC on the phone that privacy concerns would need to be addressed before the drone could be commercialized.

“This is a very promising technology for search and rescue projects, and it’s much more economically viable than existing options. In circumstances where a helicopter or a drone could be used, a drone would be much cheaper to deploy,” she said.

“The technology has lots of other applications as well — for example, it could be used to examine the quality of materials on a collapsed bridge. But technologies must be developed in line with public acceptance of them, so these drones should be designed in a way that notifies the public they are being used by the authorities — this could be done by using distinctive colors. In rolling them out companies would also have to make sure they are not violating personal data regulations.”

Story 3:  This wireless AI camera runs entirely on solar power

Source:  The Verge Story by James Vincent

Link: https://www.msn.com/en-us/news/technology/this-wireless-ai-camera-runs-entirely-on-solar-power/ar-BBTEnko?ocid=News

Note: be sure to check out the video at the end of the article [use link]

A big trend in AI is the transition from cloud to edge computing. Instead of AI devices doing their computation remotely via an internet connection, they’re increasingly handling things locally, with algorithms working directly on-device. Benefits of this approach can include faster results, greater security, and more flexibility. But how far can you push this model?

***My note: definition of edge computing

Edge Computing is pushing the frontier of computing applications, data, and services away from centralized nodes to the logical extremes of a network. It enables analytics and knowledge generation to occur at the source of the data. This approach requires leveraging resources that may not be continuously connected to a network such as laptops, smartphones, tablets and sensors.

Seattle-based startup Xnor is certainly right at the bleeding edge. This week [story posted Feb. 15] the company unveiled a prototype AI camera that runs entirely off solar power — no battery or external power source required. The camera has a piddling 320 x 320 resolution, an FPGA chip to handle processing, and is loaded with a state-of-the-art object recognition algorithm.

You could, theoretically, stick a device like this anywhere outdoors and have it transmit data back to you indefinitely. It’s compatible with a few different low-energy wireless communication protocols (because Wi-Fi drains too much battery) which let it send information over tens of kilometers. And, says Xnor, if you fit it with a battery, it can store enough power during the day to keep it running during low-sunlight hours and at night.

“We’re investigating several use-cases for these devices right now,” Mohammad Rastegari, Xnor’s CTO, tells The Verge. “From large-scale civilian projects, to monitoring inside the cabins of autonomous cars, to attaching cameras to drones.”

Xnor has a strong background in this sort of AI miniaturization. It was spun off from the Allen Institute for Artificial Intelligence back in 2017 on the back of a proprietary method for creating super-efficient machine learning systems.

(Key to this technology is a type of logic circuit known as an XNOR gate, hence the name.) It’s also proved the utility of its software by running it on low-power, low-compute devices like the Raspberry Pi Zero.
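As a toy illustration (this is not Xnor’s proprietary method, just the basic idea) of why XNOR gates map so neatly onto neural networks: when weights and activations are binarized to +1/-1, the dot product at the heart of a neural net reduces to a bitwise XNOR followed by a bit count, which is far cheaper in silicon than floating-point multiplication.

```python
def binary_dot(a_bits, w_bits, n):
    """Dot product of two +1/-1 vectors encoded as n-bit masks.

    XNOR marks positions where the signs agree; each agreement contributes
    +1 and each disagreement -1, so dot = 2 * popcount(agreements) - n.
    """
    mask = (1 << n) - 1
    agreements = ~(a_bits ^ w_bits) & mask  # bitwise XNOR, truncated to n bits
    return 2 * bin(agreements).count("1") - n

def float_dot(a, w):
    """Ordinary dot product, for comparison."""
    return sum(x * y for x, y in zip(a, w))

def to_bits(v):
    """Encode a +1/-1 vector as an integer bitmask (1 for +1, 0 for -1)."""
    bits = 0
    for i, x in enumerate(v):
        if x > 0:
            bits |= 1 << i
    return bits

a = [+1, -1, -1, +1, +1, -1]
w = [+1, +1, -1, -1, +1, +1]
assert binary_dot(to_bits(a), to_bits(w), len(a)) == float_dot(a, w)
```

The bitmask encoding is the point: 64 multiply-accumulates collapse into one XNOR and one popcount on a 64-bit word.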

This solar-powered AI camera isn’t ready for sale yet, though. Although it’s entirely self-contained, there are some constraints on its operation. For example, the number of frames it can process each second depends on how much sun it’s getting. Xnor says that on a sunny day it runs at 32fps, but this can be compensated for with bigger solar cells.

What’s clear is that devices like this are only going to become more common in the future. They’re relatively cheap (Xnor’s model costs $10), and more convenient for operators. And, as the photos and videos they take never leave the device, they’re potentially more private.

But a question remains: are we comfortable with a world full of AI eyes that are always watching? It’s a problem we’re already grappling with in the context of smart CCTV, and as Xnor’s work shows, the technology is only going to get smaller and more unobtrusive.

Story 4:  This Birdlike Robot Uses Thrusters to Float on Two Legs

Source:  Wired Story by Matt Simon

Link: https://www.wired.com/story/this-bird-like-robot-uses-thrusters-to-float-on-two-legs/?mbid=email_onsiteshare

https://media.wired.com/photos/5c5a14097da0672ca43ed8ed/master/w_799,c_limit/terminator.jpg

See video here: https://www.youtube.com/watch?v=CFM2dRJkkzk

We humans envy birds for their seemingly effortless ability to fly, and for their ability to extract endless amounts of bread from old people in parks. But there’s a middle ground between those two states—soaring and ambling around on two feet in pursuit of crumbs—that we tend to overlook: Birds are a kind of hybrid. To walk over difficult terrain, they can flap to stabilize, while we humans trip over ourselves.

When roboticists look to nature for inspiration, they overlook this middle-ground state, developing robots that either walk or fly—but not both. “Everyone is working on drones or bipedal locomotion,” says Caltech roboticist Soon-Jo Chung. “We want to lay the groundwork for combining these two different types.”

Chung and his colleagues have debuted a strange machine called Leonardo (LEg ON Aerial Robotic DrOne, obviously), which walks on two legs but also uses thrusters attached to its torso to move like a MechWarrior.

Its main mode is a kind of hover-walk. If its engineers have anything to say about it, this wild new kind of locomotion could help load injured humans into drone ambulances, or even explore the surface of Mars. Baby steps, though: The researchers are still experimenting with Leo on a tether, so it won’t be scrambling up mountains anytime soon.

At two and a half feet tall and just six pounds (thanks to carbon fiber construction), Leo looks something like a whooping crane. It can walk around on lanky legs, just as a traditional bipedal robot like Boston Dynamics’ Atlas would (minus the backflips), but unlike a traditional robot it can use its thrusters as a failsafe. Sure, Atlas can frantically stumble to regain its footing, but that’s no guarantee it’ll save itself.

“You can take a couple of steps to avoid falling, but what if that fails?” asks Northeastern University roboticist Alireza Ramezani, designer and developer of Leo. “The thrusters in this scenario can make the system almost fault tolerant.” They offer a backup plan in case of trouble, allowing the robot to switch seamlessly between legs and thrusters depending on what’s most useful in the moment.

Combining flight and walking is no easy feat. Roboticists have been working toward a mastery of bipedal locomotion for decades, and the machines still don’t casually stroll among us. One challenge is that it takes a lot of energy just to get a two-legged robot to balance in place—even when it’s standing still, a bipedal robot has to constantly make corrections (you do the same, you just don’t notice it).

The promise of Leo is that it’ll be more efficient at this kind of idling. Instead of wobbling to correct itself, the biped can switch on the thrusters to become a drone on legs: The propellers won’t have to waste energy lifting the robot, per se, because the feet will still be in contact with the ground. “It can kind of defy the rules of gravity in a sense,” says Leo codeveloper Morteza Gharib, director of the Graduate Aerospace Laboratories at Caltech.

Another challenge of bipedal locomotion: The inherent instability of walking on two legs means robots struggle on uneven terrain. This became abundantly clear a few years ago during the Darpa Robotics Challenge, in which humanoid robots succeeded mostly in just falling on their faces. One wrong step, or a slight shift in the surface beneath its feet, and the bot hits the ground. That’s particularly problematic for bipedal robots because they aren’t yet dexterous enough to pick themselves back up.

Leo, on the other hand, would essentially float over difficult terrain, making missteps less likely to be catastrophic. If things get really hairy, for instance if the robot needs to get up a hill, it might be able to bound instead of scramble. “How can we actually leverage the legs to a point where you can turn on the thrusters and manage a jumping transition to flight?” asks Ramezani.

***Note: I will stop here and encourage listeners to read the entire article

Which brings us to Mars—at least one day. Caltech has been collaborating with NASA on a helicopter for the Red Planet, which could bypass difficult terrain and scoot around more quickly than a wheeled rover. The nice thing about a rover, though, is that it can sit on the surface without draining its batteries. To hover, a drone has to use power.

“The Mars helicopter concept is going to be very limited,” says Gharib. “We realized we need to have a much longer flight time and also be able to stabilize and take samples or look at rocks.” Now Gharib and his colleagues are exploring how a robot like Leo might navigate the rough Martian landscape more efficiently than a traditional helicopter by putting its feet on the ground and lightly powering up the thrusters.

They’re also looking into how Leo might be of use here on Earth as a sort of robot companion to a flying ambulance (a small-scale version of which Caltech is testing against a wall of 1,300 computer fans, by the way), which could carry a person out of an otherwise inaccessible area. The problem, of course, is in getting an incapacitated person into the ambulance without another human on the ground helping. But teams of Leo robots may one day take up that job.

To be clear, Leo isn’t meant to be the only bipedal machine in a world that will soon be crawling with robots. It will have its use cases, as will traditional humanoid robots like Atlas. You wouldn’t want a robot like Leo to be buzzing around your home, after all. But this new class of bipeds could well find a footing on Earth—and beyond.

Story 5:  MIT scientists are using lobsters to develop a new form of flexible body armor

Source:  The Washington Post Story by Peter Holley

Link: https://wapo.st/2H0UmL6

https://www.washingtonpost.com/resizer/EiYqOIhXtiRhpTURxu7SYxkRuH0=/1484x0/arc-anglerfish-washpost-prod-washpost.s3.amazonaws.com/public/XNUKSLWT3AI6PGWZZIDBT3P2AU.jpg

Imagine a highly sophisticated body armor that is tough — a shield that consists largely of water, yet is strong enough to prevent mechanical penetration — while being flexible enough for a wearer to move easily, whether swimming, walking across the ground or rushing to escape danger.

That description might sound like a suit worn by a fictional hero in the DC Comics franchise, but it actually describes portions of a lobster’s exoskeleton.

Researchers at the Massachusetts Institute of Technology and Harvard say the soft membrane covering the crustacean’s joints and abdomen — a material that is as tough as the industrial rubber used to make car tires and garden hoses — could guide the development of a new type of flexible body armor for humans, one designed to cover joints such as knees and elbows.

The researchers’ findings appeared in Acta Materialia.

“We think this work could motivate flexible armor design,” Ming Guo, d’Arbeloff Career Development Assistant Professor in the mechanical engineering department at MIT, told MIT News, noting that lobsters’ membrane has helped them survive on Earth for more than 100 million years.

“If you could make armor out of these types of materials, you could freely move your joints, and it would make you feel more comfortable.”

Ballistic vests — commonly referred to as “body armor” — are widely used by law enforcement officers and have been credited with saving thousands of officers from handgun and rifle ammunition, according to the National Institute of Justice.

But the vests come with challenges as well. As The Washington Post’s Devlin Barrett reported in 2017, Kevlar panels — tightly woven fiber designed to stop handgun bullets — have expiration dates and usually last no more than five years. Body armor can also be ill-fitting, particularly for female officers who sometimes require custom fitting, according to NIJ.

Some studies have shown that body armor can also impair the wearer’s marksmanship and focus, as well as increase “the physiological cost to complete a task when on duty,” simultaneously providing protection and added risk, according to the National Center for Biotechnology Information.

MIT researchers say that lobsters could offer a solution to the problem plaguing most modern body armors: the more mobility an armor offers, the less it protects the wearer’s body.

Guo told MIT News that the idea for developing body armor inspired by lobsters arrived while he was eating one and noticed that the transparent membrane on the animal’s belly was difficult to chew. Unlike the crustacean’s bone-like outer shell, the animal’s softer tissues remained a mystery, he said.

Once researchers began to dissect those tissues, they made a surprising discovery. Making significant cuts into the membrane didn’t affect the material’s elasticity. Researchers determined that the elasticity and strength is due to the membrane’s unique structure, which includes tens of thousands of layers that they compare to plywood. The fibers within those layers help the material dissipate energy when it’s under stress, making it “damage tolerant,” researchers wrote.

“The knowledge learned from the soft membrane of natural lobsters sheds light on designing synthetic soft, yet strong and tough materials for reliable usage under extreme mechanical conditions, including a flexible armor that can provide full-body protection without sacrificing limb mobility,” the study added.

Guo told MIT News that material designed to replicate the strength and flexibility of lobster membranes could also be used in soft robotics and tissue engineering.

Story 6:  Award-winning serial entrepreneurs with previous exits to Facebook, Google and Apple have worked hard with AI scientists, computer vision engineers and product designers in complete stealth mode to create a revolution in Home Security and Family Safety.

Source:  Company’s website

Link: https://cherryhome.ai/  Go to the “how it works section”


Cherry Home is an intelligent device which can help people with Parkinson’s, dementia, stroke, and other common conditions to age safely in place.

Cherry’s sensors monitor activity and send notifications whenever normal patterns change or if there’s a dangerous event such as a fall or a significant stumble.

https://caregiving.cherryhome.ai/static/media/image-1.2f1e9e0c.jpg

In addition to face recognition, Cherry distinguishes people by gait, body proportion, and the colors of their clothes. Therefore, even if you turn away from the sensor, the system will recognize you from the back.

https://caregiving.cherryhome.ai/static/media/image-2.77058b19.jpg

Cherry creates behavioral patterns for each family member in a particular location, at a specific time. The anomaly detection allows the system to immediately spot and classify unusual events and bring them to your attention.
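As an illustrative sketch only (Cherry’s actual models are proprietary; the names and numbers here are made up), per-person, per-time-slot anomaly detection can be as simple as a baseline of typical activity plus a deviation threshold:

```python
from statistics import mean, stdev

def build_baseline(history):
    """Summarize past observations as (mean, stdev) per (person, hour).

    history: {(person, hour): [activity levels observed on past days]}
    """
    return {key: (mean(vals), stdev(vals)) for key, vals in history.items()}

def is_anomaly(baseline, person, hour, activity, threshold=3.0):
    """Flag an observation that deviates more than `threshold` standard
    deviations from that person's typical activity at that hour."""
    mu, sigma = baseline[(person, hour)]
    if sigma == 0:
        return activity != mu
    return abs(activity - mu) / sigma > threshold

# Hypothetical data: grandma's typical 8 a.m. activity level on past days.
history = {("grandma", 8): [50, 55, 48, 52, 51]}
baseline = build_baseline(history)
assert not is_anomaly(baseline, "grandma", 8, 53)  # a normal morning
assert is_anomaly(baseline, "grandma", 8, 0)       # no movement: flag it
```

Real systems layer learned models on top of this idea, but the principle is the same: learn what is normal for each person, place and time, and surface only the departures.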

https://caregiving.cherryhome.ai/static/media/image-3.891e447a.jpg

Cherry’s system detects and then reacts to cries for help from children and adults, and even to barking dogs, to draw your attention to an emergency. Our system can also detect sudden, unusual loud noises such as gunshots.

Audio notifications can be accompanied by visuals to give you a better understanding of what happened. With the help of sound-based event detection, Cherry can register anomalies that happen outside the sensor’s visible coverage.

https://caregiving.cherryhome.ai/static/media/image-4.5e9b9c55.jpg

Cherry uses advanced AI-powered logic to understand complex events and reduce nerve-racking false positives.

Cherry computer vision precisely detects falls of all kinds, including falls behind objects, such as a couch. Unlike wearables, Cherry can’t be triggered by abrupt movements, which significantly reduces false positives.

Story 7:  NASA names facility after mathematician who inspired ‘Hidden Figures’

Source:  ABC News Story by staff

Link: https://bit.ly/2BRiLzv


© Jason LaVeris/FilmMagic/Getty Images. Physicist Katherine Johnson poses in the press room at the 89th annual Academy Awards in Hollywood, Calif., Feb. 26, 2017.

Katherine Johnson, the “human computer” whose work was depicted in the 2016 film “Hidden Figures,” was recognized on Friday [Feb 23] as NASA renamed a building after the pioneer.

The National Aeronautics and Space Administration redesignated a building that houses programs essential to safety on space missions in the 100-year-old mathematician’s native West Virginia as the Katherine Johnson Independent Verification and Validation (IV&V) Facility.


Photo: NASA. NASA’s Katherine Johnson Independent Verification and Validation Facility in Fairmont, West Virginia.

“I am thrilled we are honoring Katherine Johnson in this way as she is a true American icon who overcame incredible obstacles and inspired so many,” NASA Administrator Jim Bridenstine said in a statement. “It’s a fitting tribute to name the facility that carries on her legacy of mission-critical computations in her honor.”

The building aptly houses programs that support NASA’s highest-profile missions “by assuring that mission software performs correctly,” according to an agency statement.

In her three decades at NASA and its predecessor agency, the National Advisory Committee for Aeronautics, Johnson calculated trajectories for space missions including Alan Shepard’s Freedom 7 mission in 1961, John Glenn’s Friendship 7 mission in 1962 and several Apollo missions.

As a black woman, Johnson and fellow black mathematicians Dorothy Vaughan and Mary Jackson shattered racial and gender barriers and stereotypes during the height of the Civil Rights Era. Their story was documented in the 2016 film “Hidden Figures,” which was based on the book by Margot Lee Shetterly. Johnson was portrayed by Taraji P. Henson.

In 2015, Johnson received the Presidential Medal of Freedom. In 2017, NASA honored her with its dedication of the Katherine Johnson Computational Research Facility in Hampton, Virginia.

Story 8:  This website uses AI to generate startling fake human faces

Source:  c/net Story by Jackson Ryan

Link: https://www.cnet.com/news/this-website-uses-ai-to-generate-startling-fake-human-faces/

See video here: https://www.youtube.com/watch?v=kSLJriaOumA

When you visit the website “This Person Does Not Exist” you will likely see a face smiling back at you. Seems innocent enough — until you realize the face is not actually real, but generated by a neural network algorithm.

That person is not real. They don’t exist.

The website’s neural network algorithm codes a “facial image from scratch from a 512 dimensional vector”, according to Phillip Wang, who created and posted about it in a Facebook group on Feb. 12. Wang suggested he created the site to “raise awareness for what a talented group of researchers made at Nvidia over the course of 2 years,” according to a post on Hacker News.

The technology is based on a state-of-the-art Nvidia-designed AI known as StyleGAN — a neural network that can separate aspects of an image to learn and generate new images. It was detailed by a team of Nvidia engineers in a preprint paper last updated on Feb. 6 at arXiv. The neural network is versatile enough that it can conjure up not just faces but bedrooms, cars and even cats.
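As a rough illustration of the latent-space arithmetic involved (no real generator here, just the inputs one would feed it), the 512-dimensional vector is typically sampled from a Gaussian, and interpolating between two such latents is what produces the smooth face-to-face morphing seen in GAN demos:

```python
import random

random.seed(42)  # for reproducibility of this sketch

def sample_latent(dim=512):
    """Draw a latent vector like the 512-d Gaussian input a GAN maps to an image."""
    return [random.gauss(0.0, 1.0) for _ in range(dim)]

def lerp(z0, z1, t):
    """Linearly interpolate two latents; sweeping t from 0 to 1 morphs
    smoothly between the two generated images."""
    return [(1 - t) * a + t * b for a, b in zip(z0, z1)]

z0, z1 = sample_latent(), sample_latent()
mid = lerp(z0, z1, 0.5)  # a latent "halfway between" two faces
assert len(mid) == 512
assert lerp(z0, z1, 0.0) == z0
assert lerp(z0, z1, 1.0) == z1
```

In a real pipeline each of these vectors would be passed to the trained generator network, which decodes it into a photorealistic image.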

I ran through a couple of refreshes and generated a whole range of faces that looked convincingly real. A small child even popped up and I kind of let out a weird “aww”. Then I realized I was cooing over a computer-generated nobody. Super weird.

There are also occasions where strange artifacts appear on faces. I saw teeth between eyes, gaping mouths and strange swirls of red appearing on cheeks and brows.

Amazingly, others have taken the StyleGAN architecture and run with it, creating fake anime characters and old-style artwork, or using it to make the President of the United States smile.

The software is available on GitHub, but take note: It requires immense processing power that only top-end graphics processing units (GPUs) or cloud services can deal with.

It’s also kind of creepy.

Story 9:  J.P. Morgan to launch a U.S. dollar-backed cryptocurrency

JPM Coin, the first of its kind from a major bank, will initially be used to transfer funds over a blockchain network, both internally and internationally between institutional clients.

Source:  ComputerWorld Story by Lucas Mearian

Link: https://www.computerworld.com/article/3340373/blockchain/jp-morgan-to-launch-a-us-dollar-backed-cryptocurrency.html


J.P. Morgan Chase plans to launch what is considered to be the first cryptocurrency backed by a major bank, a move that could legitimize blockchain as a vehicle for fiat cryptocurrencies.

JPM Coin, as the bank is calling its new cryptocoin, is considered fiat currency because it’s backed by U.S. dollars in accounts designated at JPMorgan Chase N.A.


One JPM Coin has the equivalent value of one U.S. dollar. Trials for the new cryptocoin are expected to begin in the next few months, according to a CNBC report.

In the crypto industry, an instrument like JPM Coin is known as a “stablecoin” because it has an intrinsic value, unlike Bitcoin or Ethereum’s ETH coins, whose value is based on supply and demand of virtual money.

“When one client sends money to another over the blockchain, JPM Coins are transferred and instantaneously redeemed for the equivalent amount of U.S. dollars, reducing the typical settlement time,” JPMorgan said in an online FAQ. “The JPM Coin is based on blockchain-based technology enabling the instantaneous transfer of payments between institutional accounts.”

In short, JPM Coin is basically a way of using a permissioned blockchain ledger to keep track of balance transfers within the bank’s business and internationally between institutional clients. 

J.P. Morgan clients would purchase JPM Coin, using the tokens in lieu of actual funds to make payments and transfers; JPM would then facilitate the recipient receiving the commensurate number of dollars, according to Dayna Ford, a Gartner research director focused on payments within electronic and mobile commerce.
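The flow Ford describes can be sketched as a minimal permissioned ledger (purely illustrative, not JPMorgan’s implementation) in which the bank mints tokens one-for-one against a dollar reserve, clients transfer them, and recipients redeem tokens back into dollars:

```python
class PermissionedLedger:
    """Toy stablecoin ledger: every outstanding token is backed by one
    dollar held in the bank's reserve."""

    def __init__(self, bank):
        self.bank = bank
        self.balances = {}  # tokens held per client
        self.reserve = 0    # dollars backing the tokens

    def issue(self, client, dollars):
        """Client deposits dollars; the bank mints an equal number of tokens."""
        self.reserve += dollars
        self.balances[client] = self.balances.get(client, 0) + dollars

    def transfer(self, sender, recipient, amount):
        """Move tokens between clients on the ledger; no dollars move yet."""
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient tokens")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

    def redeem(self, client, amount):
        """Burn tokens and release the equivalent dollars from the reserve."""
        if self.balances.get(client, 0) < amount:
            raise ValueError("insufficient tokens")
        self.balances[client] -= amount
        self.reserve -= amount
        return amount  # dollars paid out

ledger = PermissionedLedger("BankCo")
ledger.issue("client_a", 100)
ledger.transfer("client_a", "client_b", 40)
assert ledger.redeem("client_b", 40) == 40
assert ledger.reserve == 60  # remaining tokens stay fully backed
```

The invariant to notice is that the reserve always equals the total tokens outstanding, which is what makes one JPM Coin worth exactly one dollar.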

“As such, if institutional clients would like to move foreign funds into a different institution, it would need to involve [J.P. Morgan] as an intermediary bank,” Ford said. “Once the funds are converted to a local currency within [J.P. Morgan], they would ride existing rails, such as wire or SWIFT, between the two banks in-country, at least initially.”

J.P. Morgan Chase did not respond to Computerworld queries as to whether the bank is considering using a cryptocurrency in its retail business.

Even if only used for its wholesale business, JPM Coin amounts to a public endorsement of distributed ledger technology (DLT) and its practical functionality for business – something enterprises have sought from blockchain-based solutions for years, according to Kevin McMahon, executive director of emerging technologies at digital technology consultancy SPR.

“While the direct impact will be limited to JP Morgan and their institutional clients, the optics and endorsement of the technologies will ripple beyond the financial services industry,” McMahon said, “meaning the outcome of this JPM Coin experiment will be watched closely by those considering distributed ledger technologies for their own purposes.”

McMahon noted nuances with other cryptocurrencies, saying JPM Coin isn’t exactly crypto but a “financial instrument that leverages blockchain technologies.”

***Note to listeners: be sure to read the rest of the article