Monday, March 31, 2025

IEEE Women in Engineering Membership on the Rise




It’s been a busy three years since IEEE Women in Engineering celebrated its 25th anniversary in 2022.

WIE facilitates the recruitment and retention of women in technical disciplines around the world. It also works to inspire girls to pursue an engineering career. There are student chapters at universities around the globe. Men may join the affinity group as well.

Women make up less than a third of the world’s workforce in technology-related fields, according to a report by the World Bank Group.

WIE has introduced several programs focused on increasing the number of women in the science, technology, engineering, and math fields. The new initiatives include grants and leadership programs, and several contests have been launched to encourage and support female students who are considering a STEM career.

The group’s hard work is paying off. Winnie Ye, the current WIE chair and an electronics professor at Carleton University, in Ottawa, reports that, as of February, the number of members and student members was up over the same period last year. There are more than 28,800 members, a year-over-year increase greater than 17 percent. Student membership, at nearly 21,000, is up by roughly the same percentage. There are now more than 1,100 affinity groups.

“Despite this growth, attracting and retaining paying members remains a challenge, particularly as students transition to professional membership,” Ye says. “Ensuring long-term engagement requires demonstrating clear value through career development, networking, and leadership opportunities.”

A day of celebration, grants, and leadership training

Many of the new programs were launched under Celia Shahnaz’s leadership. The WIE committee’s first elected chair, Shahnaz served in 2023 and 2024. In 2022 the committee made the chair an elected position.

Shahnaz has been a WIE volunteer for more than 20 years. She established the first WIE group in the IEEE Bangladesh Section in 2010 and became its first chair. The IEEE senior member is a professor of electrical and electronic engineering at the Bangladesh University of Engineering and Technology, in Dhaka.

She says she is most proud of bringing back the annual IEEE WIE Day, held on 29 June, the date the group was formally established in 1997.

“The day is a member-engaging event targeting membership development and membership retention,” Shahnaz says. Activities include networking events, seminars, and workshops.

This year’s theme is Pioneering Safe Cyberspace: Bridging Technology and Light for Security.

For some groups, one day is not enough, so WIE Day events stretch out over several weeks. The inaugural event held in 2023 featured more than 120 activities worldwide. The number nearly doubled last year, with over 230 events. This year’s celebrations are scheduled to kick off on 23 June, to coincide with International Women in Engineering Day, then end on 7 July.

The IEEE WIE Family Cares grant program was established to financially assist members with caregiving duties so they can attend an IEEE conference. A grant provides up to US $1,000 to cover expenses related to caring for children, senior citizens, and family members who are disabled. The grant is sponsored by the IEEE Foundation.

To keep its members updated on industry trends, as well as research results and their practical applications, WIE has partnered with several IEEE societies and councils to offer a virtual Distinguished Lecturers program. More than 20 groups have provided speakers, including the Computer Society, Power Electronics Society, and Sensors Council.

“I want to really strengthen WIE leadership with the local community so that we can provide more targeted resources and support women.” —Winnie Ye

There’s also the new Industry Experts Network, a global database of industry professionals, researchers, and leaders who can be called upon to participate in an event.

“For any of our events, we can access this list of amazing people, who we can ask to give talks and offer workshops and seminars,” Ye says.

Several WIE programs are focused on providing women with leadership skills. Women hold fewer than 10 percent of positions such as CIO, CTO, IT manager, and technical team leader, according to this year’s Women in Tech Stats report.

The WIE International Leadership Summits, which are held around the world, provide opportunities for networking, mentorship, and collaboration. Eight were held last year, and seven are scheduled for this year in countries including Jordan, Pakistan, Poland, and the United States.

The WIE Forum USA East helps its attendees develop and improve their leadership skills through talks by successful leaders. This year’s event is scheduled to be held from 6 to 8 November in Arlington, Va.

Ye says she is most excited about the return of WIE’s International Leadership Conference. After a two-year absence, the event is scheduled to be held on 15 and 16 May in San Jose, Calif. The theme is Accelerating Leadership in an AI-Powered World. Registration is now open.

STEM-related contests

Several contests have been launched to encourage young women to pursue a STEM career, Shahnaz says. Women make up about 35 percent of STEM graduates—a gender disparity that hasn’t changed in the past 10 years, according to UNESCO’s 2024 Global Education Monitoring Report.

Held for the first time in 2023, the WIE Climate Tech Big Idea Pitch competition encourages female engineering students and researchers to be entrepreneurial and is designed to increase the number of technical startups led by women.

“I really wanted to inspire women to build their capacity using their technical knowledge and professional skills to be a startup founder,” Shahnaz says. “We wanted them to come up with a solution to climate-change-related problems and showcase their business ideas and models. We also wanted to nurture the talent of women in all membership grades including senior members, Fellows, and life Fellows.”

In the 2023 contest, a team of engineering students from the Bangladesh University of Engineering and Technology won first place in the impact category for its design of a bamboo filter for diesel engines.

There were 60 submissions that first year, and the number doubled last year. The deadline for submissions for this year’s contest will be announced soon.

Capitalizing on the popularity of manga comics and graphic novels with young people, WIE launched a competition in 2023 to find the best-written manga that centered on Riko-chan, a fictional character who is a preuniversity student. Riko-chan uses STEM tools to help solve everyday problems. More than 40 people participated in the 2023 contest, and there were 81 submissions last year. Their stories are available to read online, and many have been translated into nine languages including French, Hindi, and Spanish.

“We see this contest as an opportunity to showcase the diverse role models in engineering technology,” Ye says. “The goal is to spark curiosity among our younger audience and make STEM fields more relatable and more exciting for them.”

This year’s manga competition is now accepting submissions. Check out the rules and deadlines on the WIE website.

Mentoring and outreach programs

A new mentoring program is in the works, Ye says.

“We want to really create an active mentor-mentee matching and engagement platform within the WIE community,” she says.

As part of her vision for a more engaged and inclusive community, she has launched the WIE Ambassador program. The initiative is designed to empower dedicated WIE members to advocate for IEEE’s mission globally. The ambassadors can promote WIE initiatives, organize local events, and inspire more women to pursue STEM careers, Ye says.

She emphasizes the importance of expanding WIE’s presence in underrepresented regions such as China and South Africa.

“During my term,” she says, “I’m committed to expanding our presence in these regions. I want to really strengthen WIE leadership with the local community so that we can provide more targeted resources and support women. We want to make sure that they are aware of us and become more active.”

Reference: https://ift.tt/WvDS0Lc

Before the Undo Command, There Was the Electric Eraser




I’m fascinated with the early 20th-century zeal for electrifying everyday things. Hand tools, toasters, hot combs—they all obviously benefited from the jolt of electrification. But the eraser? What was so problematic about the humble eraser that it needed electrifying?

A number of things, it turned out. According to Hermann Lukowski in his 1935 patent application for an apparatus for erasing, “Hand held rubbers are clumsy and cover a greater area than may be required.” Aye, there’s the rub, as it were. Lukowski’s cone-tipped electric eraser, he argued, could better handle the fine detail.

An electric eraser could also be a timesaver. In the days before computer-aided drawing and the ease of the delete and undo commands, manually erasing a document could be a delicate operation.

Consider the careful technique Roscoe C. Sloane and John M. Montz suggest in their 1930 book Elements of Topographic Drawing. To make a correction to a map, these civil engineering professors at Ohio State University recommend the following steps:

  1. With a smooth, sharp knife pick the ink from the paper. This can be done without marring the surface.
  2. Place a hard, smooth surface, such as a [drafting] triangle, under the erasure before rubbing starts.
  3. When practically all the ink has been removed with the knife, rub with a pencil eraser.

Erasing was not for the faint of heart!

A Brief History of the Eraser

Where did the eraser get its start? The British scientist Joseph Priestley is celebrated for his discovery of oxygen and not at all celebrated for his discovery of the eraser. Around 1766, while working on The History and Present State of Electricity, he found himself having to draw his own illustrations. First, though, he had to learn to draw, and because any new artist naturally makes mistakes, he also needed to erase.

In 1766 or thereabouts, Joseph Priestley discovered the erasing properties of natural rubber. Alamy

Alas, there weren’t a lot of great options for erasing at the time. For items drawn in ink, he could use a knife to scrape away errors; pumice or other rough stones could also be used to abrade the page and remove the ink. To erase pencil, the customary approach was to use a piece of bread or bread crumbs to gently grind the graphite off the page. All of the methods were problematic. Without extreme care, it was easy to damage the paper. Using bread was also messy, and as the writer and artist John Ruskin allegedly said, a waste of perfectly good bread.

As the story goes, Priestley one day accidentally grabbed a piece of caoutchouc, or natural rubber, instead of bread and discovered that it could erase his mistakes.

Priestley may have discovered this attribute of rubber, but Edward Nairne, an inventor, optician, and scientific-instrument maker, marketed it for sale. For three shillings (about one day’s wages for a skilled tradesman), you could purchase a half-inch (1.27-cm) cube of the material. Priestley acknowledged Nairne in the preface of his 1770 tutorial on how to draw, A Familiar Introduction to the Theory and Practice of Perspective, noting that caoutchouc was “excellently adapted to the purpose of wiping from paper the marks of a black-lead-pencil.” By the late 1770s, cubes of caoutchouc were generally known as rubbers or lead-eaters.

Natural rubber might be good for erasing, but it isn’t necessarily an item you want sitting on your desk. It is extremely sensitive to temperature, becoming hard and brittle in the cold and soft and gummy in the heat. Over time, it inevitably degrades. And worst of all, it becomes stinky.

What was so problematic about the humble eraser that it needed electrifying?

Luckily, there were lots of other people looking for ways to improve natural rubber, and in 1839 Charles Goodyear developed the vulcanization process. By adding sulfur to natural rubber and then heating it, Goodyear discovered how to stabilize rubber in a firm state, what we would call today the thermosetting of polymers. In 1844 Goodyear patented a process to create rubber fabric. He went on to make rubber shoes and other products. (The tire company that bears his name was founded by the brothers Charles and Frank Seiberling several decades later.) Goodyear unfortunately died penniless, but we did get a better eraser out of his discovery.

Who Really Invented the Electric Eraser?

Albert Dremel, who opened his eponymous company in 1932, often gets credit for the invention of the electric eraser, but if that’s true, I can find no definitive proof. Out of more than 50 U.S. patents held by Dremel, none are for an electric eraser. In fact, other inventors may have a better claim, such as Homer G. Coy, who filed a patent for an electrified automatic eraser in 1927, or Ola S. Pugerud, who filed a patent for a rotatable electric eraser in 1906.

The Dremel Moto-Tool, introduced in 1935, came with an array of swappable bits. One version could be used as an electric eraser. Dremel

In 1935 Dremel did come out with the Moto-Tool, the world’s first handheld, high-speed rotary tool that had interchangeable bits for sanding, engraving, burnishing, and sharpening. One version of the Moto-Tool was sold as an electric eraser, although it was held more like a hammer than a pencil.

Regardless of who invented the device, electric erasers were definitely on the market by 1929. Some of the earliest adopters were librarians, specifically those who maintained and had to frequently update the card catalog. Margaret Mann, an associate professor of library science at the University of Michigan, listed an electric eraser as recommended equipment in her 1930 book Introduction to Cataloging and the Classification of Books. She described a flat, round rubber eraser mounted on a motor-driven instrument similar to a dentist’s drill. The eraser could remove typewriting and print from catalog cards without leaving a rough appearance. By 1937, discussions of electric erasers were part of the library science curriculum at Columbia University. Electric erasers had gone mainstream.

To erase pencil, the customary approach was to use a piece of bread to gently grind the graphite off the page.

In 1930, the Charles Bruning Co.’s general catalog had six pages of erasers and accessories, with two pages devoted to the company’s electric erasing machine. Bruning, which specialized in engineering, drafting, and surveying supplies, also offered a variety of nonelectrified eraser products, including steel erasers (also known as desk knives), eraser shields (used to isolate the area to be erased), and a chisel-shaped eraser to put on the end of a pencil.

The Loren Specialty Manufacturing Co. arrived late to the electric eraser game, introducing its first such product in 1953. Held in the hand like a pen or pencil, the Presto electric eraser would vibrate to abrade a small area in need of correction. The company spun off the Presto brand in 1962, about the time the Presto Model 80 [shown at top] was produced. This particular unit was used by office workers at the New York Life Insurance Co. and is now housed at the Smithsonian’s Cooper Hewitt.

The Creativity of the Eraser

When I was growing up, my dad kept an electric eraser next to his drafting table. I loved playing with it, but it wasn’t until I began researching this article that I realized I had been using it all wrong. The pros know you’re supposed to shape the cylindrical rubber into a point in order to erase fine lines.

Today almost all draftsmen, librarians, and office workers have gone digital, but some visual artists still use electric erasers. One of them is artist and educator Darrel Tank, who specializes in pencil drawings. I watched several of his surprisingly fascinating videos comparing various models of electric erasers. Seeing Tank use his favorite electric eraser to create texture on clothing or movement in hair made me realize that drawing is not just an additive process. Sometimes it is what’s removed that makes the difference.


Susan Piedmont-Palladino, an architect and professor at Virginia Tech’s Washington-Alexandria Architecture Center, has also thought a lot about erasing. She curated the exhibit “Tools of the Imagination: Drawing Tools and Technologies from the Eighteenth Century to the Present” at the National Building Museum in 2005 and authored the companion book of the same title. Piedmont-Palladino describes architectural design as a long process of doing, undoing, and redoing, deciding which ideas can stay and which must go.

Piedmont-Palladino writes lovingly of a not-too-distant past, where this design process was captured in a building’s plans. When the architect was drafting by hand, the paper itself became a record of distress, showing where it had been scraped, erased, and redrawn. You could see points of uncertainty and points of decisiveness. But today, when almost all architectural drawing is done on a computer, users delete instead of erase. With a few keystrokes, an object can disappear, move to another spot, or miraculously reappear from the trash can. The design process is no longer etched into the page.

Of course, the pencil, the eraser (electric or not), and the computer are all just tools for transmitting and visualizing ideas. The tools of any age reflect society in ways that aren’t always clear until new tools come to replace them. Both the pencil and the eraser had to be invented, and it is up to historians to make sure they aren’t forgotten.

Part of a continuing series looking at historical artifacts that embrace the boundless potential of technology.

An abridged version of this article appears in the April 2025 print issue as “When Electrification Came for the Eraser.”

References


The electric eraser, more than any object I have researched for Past Forward, has the most incorrect information about its history on the Internet—wrong names, bad dates, inaccurate assertions—which get repeated over and over again as fact. It’s a great reminder of the need to go back to original sources.

As always, I enjoyed digging through patents to trace the history of invention and innovation in electric erasers.

Other primary sources I consulted include Margaret Mann’s Introduction to Cataloging and the Classification of Books, a syllabus to Columbia University’s 1937 course on Library Service 201, and the Charles Bruning Co.’s 1930 catalog.

Although Henry Petroski’s The Pencil: A History of Design and Circumstance only has a little bit of information on the history of erasers, it’s a great read about the implement that does the writing that needs to be erased.

Reference: https://ift.tt/o9b3jpn

Sunday, March 30, 2025

"The Doctor Will See Your Electronic Health Record Now"




Cheryl Conrad no longer seethes with the frustration that threatened to overwhelm her in 2006. As described in IEEE Spectrum, Cheryl’s husband, Tom, has a rare genetic disease that causes ammonia to accumulate in his blood. At an emergency room visit two decades ago, Cheryl told the doctors Tom needed an immediate dose of lactulose to avoid going into a coma, but they refused to medicate him until his primary doctor confirmed his medical condition hours later.

Making the situation more vexing was that Tom had been treated at that facility for the same problem a few months earlier, and no one could locate his medical records. After Tom’s recovery, Cheryl vowed to always have immediate access to them.

Today, Cheryl says, “Happily, I’m not involved anymore in lugging Tom’s medical records everywhere.” Tom’s two primary medical facilities use the same electronic health record (EHR) system, allowing doctors at both facilities to access his medical information quickly.

In 2004, President George W. Bush set an ambitious goal for U.S. health care providers to transition to EHRs by 2014. Electronic health records, he declared, would transform health care by ensuring that a person’s complete medical information was available “at the time and place of care, no matter where it originates.”

President George W. Bush looks at an electronic medical record system during a visit to the Cleveland Clinic on 27 January 2005. Brooks Kraft/Corbis/Getty Images

Over the next four years, a bipartisan Congress approved more than US $150 million in funding aimed at setting up electronic health record demonstration projects and creating the administrative infrastructure needed.

Then, in 2009, during efforts to mitigate the financial crisis, newly elected President Barack Obama signed the $787 billion economic stimulus bill. Part of it contained the Health Information Technology for Economic and Clinical Health Act, also known as the HITECH Act, which budgeted $49 billion to promote health information technology and EHRs in the United States.

As a result, Tom, like most Americans, now has an electronic health record. However, many millions of Americans now have multiple electronic health records. On average, patients in the United States visit 19 different kinds of doctors throughout their lives. Further, many specialists have unique EHR systems that do not automatically share medical data with one another, so patients must update their medical information for each one. Nevertheless, Tom now has immediate access to all his medical treatment and test information, something not readily available 20 years ago.

Tom’s situation underlines the paradox of how far the United States has come since 2004 and how far it still must go to achieve President Bush’s vision of a complete, secure, easily accessible, and seamlessly interoperable lifetime EHR.


As of 2021, nearly 80 percent of physicians and almost all nonfederal acute-care hospitals deployed an electronic health record system.


Many patients in the United States today have simply traded fragmented paper medical record silos for a plethora of fragmented electronic ones. And thousands of health care providers are burdened with costly, poorly designed, and insecure EHR systems that have exacerbated clinician burnout, led to hundreds of millions of medical records lost in data breaches, and created new sources of medical errors.

The baseline standardization that EHRs provide does help centralize a very fragmented health care system, but in the rush to get EHR systems adopted, key technological and security challenges were overlooked and underappreciated. Further problems arose from the sheer complexity of the systems being deployed. These still-unresolved issues are now coupled with the unknown consequences of bolting on immature AI-driven technologies. Unless more thought and care go into how the United States proceeds toward a fully integrated health care system, we could unintentionally leave the entire system in a worse place than when President Bush first declared his EHR goal in 2004.

IT to Correct Health Care Inefficiencies Is a Global Project

Putting government pressure on the health care industry to adopt EHR systems through various financial incentives made sense by the early 2000s. Health care in the United States was in deep trouble. Spending increased from $74.1 billion in 1970 to more than $1.4 trillion by 2000, 2.3 times as fast as the U.S. gross domestic product. Health care costs grew at three times the rate of inflation from 1990 to 2000 alone, surpassing 13 percent of GDP.

Two major studies conducted by the Institute of Medicine in 2000 and 2001, titled To Err Is Human and Crossing the Quality Chasm, found that health care was deteriorating in terms of accessibility, quality, and safety. Inferior quality and needless medical treatments, including overuse or duplication of diagnostic tests, underuse of effective medical practices, misuse of drug therapies, and poor communication between health care providers emerged as particularly frustrating problems.

Administrative waste and unnecessary expenditures were substantial cost drivers, from billing to resolving insurance claims to managing patients’ cases. Health care’s administrative side was characterized as a “monstrosity,” with huge transaction costs associated with an estimated 30 billion communications conducted by mail, fax, or telephone annually at that time.

Both health care experts and policymakers agreed that reductions in health care delivery and its costs were possible only by deploying health information technology such as electronic prescribing and EHR. Early adopters of EHR systems like the Mayo Clinic, Cleveland Clinic, and the U.S. Department of Veterans Affairs proved the case. Governments across the European Union and the United Kingdom reached the same conclusion.

There has been a consistent push, especially in more economically advanced countries, to adopt EHR systems over the past two decades. For example, the E.U. has set a goal of providing 100 percent of its citizens across 27 countries access to electronic health records by 2030. Several countries are well on their way to this achievement, including Belgium, Denmark, Estonia, Lithuania, and Poland. Outside the E.U., countries such as Israel and Singapore also have very advanced systems, and after a rocky start, Australia’s My Health Record system seems to have found its footing. The United Kingdom was hoping to be a global leader in adopting interoperable health information systems, but a disastrous implementation of its National Programme for IT ended in 2011 after nine years and more than £10 billion. Canada, China, India, and Japan also have EHR system initiatives in place at varying levels of maturity. However, it will likely be years before they achieve the same capabilities found in leading digital-health countries.

EHRs Need a Systems-Engineering Approach

When it comes to embracing automation, the health care industry has historically moved at a snail’s pace, and when it does move, IT is rarely first in line for funding. Market forces alone were unlikely to speed up EHR adoption.

Even in the early 2000s, health care experts and government officials were confident that digitalization could reduce total health spending by 10 percent while improving patient care. In a highly influential 2005 study, the RAND Corp. estimated that adopting EHR systems in hospitals and physician offices would cost $98 billion and $17 billion, respectively. The report also estimated that these entities would save at least $77 billion a year after moving to digital records. A highly cited paper in Health Affairs from 2005 also claimed that small physician practices could recoup their EHR system investments in 2.5 years and profit handsomely thereafter.

Moreover, RAND claimed that a fully automated health care system could save the United States $346 billion per year. When Michael O. Leavitt, then the Secretary of Health and Human Services, looked at the projected savings, he saw them as “a key part of saving Medicare.” As baby boomers began retiring en masse in the early 2010s, cutting health care costs was also a political imperative since Medicare funding was projected to run out by 2020.

Some doubted the EHR revolution’s claims of improved health care and reduced costs, or that the transition could be achieved within 20 years. The Congressional Budget Office argued that the RAND report overstated the potential costs and benefits of EHR systems and ignored peer-reviewed studies that contradicted it. The CBO also pointed out that RAND assumed EHR systems would be widely adopted and effectively used, which presumed that effective tools already existed, though very few commercially available systems at the time qualified. There was also skepticism about whether the benefits seen by early adopters of EHR systems, who had spent decades perfecting their systems, could be replicated once the five-year period of governmental EHR adoption incentives ended.

Even former House Speaker Newt Gingrich, a strong advocate for electronic health record systems, warned that health care was “30 times more difficult to fix than national defense.” The extent of the problem was one reason the 2005 National Academy of Sciences report, Building a Better Delivery System: A New Engineering / Health Care Partnership, forcefully and repeatedly called for innovative systems-engineering approaches to be developed and applied across the entire health care delivery process. The scale, complexity, and extremely short time frame for attempting to transform the totality of the health care environment demanded a robust “system of systems” engineering approach.

This was especially true because of the potential human impacts of automation on health care professionals and patients. Researchers warned that ignoring the interplay of computer-mediated work and existing sociotechnical conditions in health care practices would result in unexpected, unintentional, and undesirable consequences.

Additionally, without standard mechanisms for making EHR systems interoperable, many potential benefits would not materialize. As David Brailer, the first National Health Information Technology Coordinator, stated, “Unless interoperability is achieved…potential clinical and economic benefits won’t be realized, and we will not move closer to badly needed health care reform in the U.S.”

HITECH’s Broken Promises and Unforeseen Consequences

A few years later, policymakers in the Obama administration thought it was unrealistic to prioritize interoperability. They feared that defining interoperability standards too early would lock the health industry into outdated information-sharing approaches. Further, no existing health care business model supported interoperability, and strong business incentives actively discouraged providers from sharing information. If patient information could easily move to another provider, for example, what incentive does a provider have to share it readily?

Instead, policymakers decided to have EHR systems adopted as widely and quickly as possible during the five years of HITECH incentives. Tackling interoperability would come later. The government’s unofficial operational mantra was that EHR systems needed to become operational before they could become interoperable.

“Researchers have found that doctors spend between 3.5 and 6 hours a day (4.5 hours on average) filling out their digital health records.”

Existing EHR system vendors, making $2 billion annually at the time, viewed the HITECH incentive program as a once-in-a-lifetime opportunity to increase market share and revenue streams. Like fresh chum to hungry sharks, the subsidy money attracted a host of new EHR technology entrants eager for a piece of the action. The resulting feeding frenzy pitted an IT-naïve health care industry rushing to adopt EHR systems against a horde of vendors willing to promise (almost) anything to make a sale.

A few years into the HITECH program, a 2013 report by RAND wryly observed the market distortion caused by what amounted to an EHR adoption mandate: “We found that (EHR system) usability represents a relatively new, unique, and vexing challenge to physician professional satisfaction. Few other service industries are exposed to universal and substantial incentives to adopt such a specific, highly regulated form of technology, which has, as our findings suggest, not yet matured.”

In addition to forcing health care providers to choose quickly among a host of immature EHR solutions, the HITECH program undercut the warnings raised about the need for systems engineering and about the impact of automation on the deeply human-centered work of health care professionals. Sadly, the lack of attention to these concerns still affects current EHR systems.

Today, studies like that conducted by Stanford Medicine indicate that nearly 70 percent of health care professionals express some level of satisfaction with their electronic health record system and that more than 60 percent think EHR systems have improved patient care. Electronic prescribing has also been seen as a general success, with the risk of medication errors and adverse drug events reduced.

However, professional satisfaction with EHRs runs shallow. The poor usability of EHR systems surfaced early in the HITECH program and continues to be a main driver of physician dissatisfaction. The Stanford Medicine study, for example, also reported that 54 percent of physicians polled felt their EHR systems detracted from their professional satisfaction, and 59 percent felt the technology needed a complete overhaul.


“What we’ve essentially done is created 24/7/365 access to clinicians with no economic model for that: The doctors don’t get paid.” —Robert Wachter, chair of the department of medicine at the University of California, San Francisco


Poor EHR system usability results in laborious and low-value data entry, obstacles to face-to-face patient communication, and information overload, where clinicians have to wade through an excess of irrelevant data when treating a patient. A 2019 study in Mayo Clinic Proceedings comparing EHR system usability to other IT products like Google Search, Microsoft Word, and Amazon placed EHR products in the bottom 10 percent.

Electronic health record systems were supposed to increase provider productivity, but for many clinicians, their EHRs are productivity vampires instead. Researchers have found that doctors spend between 3.5 and 6 hours a day (4.5 hours on average) filling out their patients’ digital health records, with an Annals of Internal Medicine study reporting that doctors in outpatient settings spend only 27 percent of their work time face-to-face with their patients.

In those visits, patients often complain that their doctors spend too much time staring at their computers. They are not likely wrong, as nearly 70 percent of doctors in 2018 felt that EHRs took valuable time away from their patients. To address this issue, health care providers employ more than 100,000 medical scribes today—or about one for every 10 U.S. physicians—to record documentation during office visits, but this only highlights the unacceptable usability problem.

Furthermore, physicians are spending more time dealing with their EHRs because the government, health care managers, and insurance companies are requesting more patient information regarding billing, quality measures, and compliance data. Patient notes are twice as long as they were 10 years ago. This is not surprising, as EHR systems so far have not complemented clinician work as much as directed it.

“A phenomenon of the productivity vampire is that the goalposts get moved,” explains University of Michigan professor emeritus John Leslie King, who coined the phrase “productivity vampire.” King, a student of system–human interactions, continues, “With the ability to better track health care activities, more government and insurance companies are going to ask for that information in order for providers to get paid.”



Robert Wachter, chair of the department of medicine at the University of California, San Francisco, and author of The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age, believes that EHRs “became an enabler of corporate control and outside entity control.”

“It became a way that entities that cared about what the doctor was doing could now look to see in real time what the doctor was doing, and then influence what the doctor was doing and even constrain it,” Wachter says.

Federal law mandates that patients have access to their medical information contained in EHR systems—which is great, says Wachter, but this also adds to clinician workloads, as patients now feel free to pepper their physicians with emails and messages about the information.

“What we’ve essentially done is created 24/7/365 access to clinicians with no economic model for that: The doctors don’t get paid,” Wachter says. His doctors’ biggest complaint is that their EHR system has flooded their email inboxes with patient inquiries. Some doctors report that their inboxes have become the equivalent of a second set of patients.

It is not so much a problem with the electronic information system design per se, notes Wachter, but with EHR systems that “meet the payment system and the workflow system in ways that we really did not think about.” EHRs also promised to reduce stress among health care professionals. Numerous studies have found, however, that EHR systems worsen clinician burnout, with Stanford Medicine finding that 71 percent of physicians felt the systems contributed to burnout.


Half of U.S. physicians are experiencing burnout, with 63 percent reporting at least one manifestation in 2022. The average physician works 53 hours weekly (19 hours more than the general population) and spends over 4 hours daily on documentation.


Burnout is lowest among clinicians with highly usable EHR systems or in specialties that interact least with their EHR systems, such as surgery and radiology. Physicians who make, on average, 4,000 EHR system clicks per shift, like emergency room doctors, report the highest levels of burnout.

Aggravating the situation, notes Wachter, was “that decision support is so rudimentary…which means that the doctors feel like they’re spending all this time entering data in the machine, (but) getting relatively little useful intelligence out of it.”

Poorly designed information systems can also compromise patient safety. Evidence suggests that EHR systems with unacceptable usability contribute to low-quality patient care and reduce the likelihood of catching medical errors. According to a study funded by the U.S. Agency for Healthcare Research and Quality, EHR system issues were involved in the majority of malpractice claims over a six-and-a-half-year period of study ending in 2021. Sadly, the situation has not changed today.

Interoperability, Cybersecurity Bite Back

EHR system interoperability closely follows poor EHR system usability as a driver of health care provider dissatisfaction. Recent data from the Assistant Secretary for Technology Policy / Office of the National Coordinator for Health Information Technology indicates that 70 percent of hospitals sometimes exchange patient data, though only 43 percent claim they regularly do. System-affiliated hospitals share the most information, while independent and small hospitals share the least.

Exchanging information using the same EHR system helps. Wachter observes that interoperability among similar EHR systems is straightforward, but across different EHR systems, he says, “it is still relatively weak.”

However, even if two hospitals use the same EHR vendor, communicating patient data can be difficult if each hospital’s system is customized. Studies indicate that patient mismatch rates can be as high as 50 percent, even in practices using the same EHR vendor. This often leads to duplicate patient records that lack vital patient information, which can result in avoidable patient injuries and deaths.

Sharing information tied to a unique patient identifier (UPI), as countries with advanced EHRs such as Estonia, Israel, and Singapore do, makes health information interoperability easier, says Christina Grimes, digital health strategist for the Healthcare Information and Management Systems Society (HIMSS).

But in the United States, “Congress has forbidden it since 1998” and steadfastly resists allowing for UPIs, she notes.

Using a single-payer health insurance system, like most other countries with advanced EHR systems, would also make sharing patient information easier, decrease time spent on EHRs, and reduce clinician burnout, but that is also a nonstarter in the United States for the foreseeable future.

Interoperability is even more challenging because an average hospital uses 10 different EHR vendors internally to support more than a dozen different health care functions, and an average health system has 16 different EHR vendors when affiliated providers are included. Grimes notes that only a small percentage of health care systems use fully integrated EHR systems that cover all functions.

EHR system adoption also promised to bend the national health care cost curve, but costs continue to rise. The United States spent an estimated $4.8 trillion on health care in 2023, or 17.6 percent of GDP. While there seems to be general agreement that EHRs can help with cost savings, no rigorous quantitative studies at the national level show the tens of billions of dollars in promised savings that RAND loudly proclaimed in 2005.

Studies have also shown that health care providers, especially those in rural areas, have had difficulty saving money by using EHR systems. A recent study, for example, points out that rural hospitals benefit less from EHR systems than urban hospitals in terms of reducing operating costs. With 700 rural hospitals at risk of closing due to severe financial pressures, investing in EHR systems has not proved to be the financial panacea many hoped it would be.

Cybersecurity is a major cost not included in the 2005 RAND study. Even though there were warnings that cybersecurity was being given short shrift, vendors, providers, and policymakers paid scant attention to the cybersecurity implications of EHR systems, especially the multitude of new cyberthreat access points that would be created and potentially exploited. Tom Leary, senior vice president and head of government relations at HIMSS, points out the painfully obvious fact that “security was an afterthought. You have to make sure that security by design is involved from the beginning, so we’re still paying for the decision not to invest in security.”

From 2009 to 2023, a total of 5,887 health care breaches of 500 records or more were reported to the U.S. Department of Health and Human Services Office for Civil Rights, resulting in some 520 million health care records being exposed. Health care breaches have also led to widespread disruption of medical care in various hospital systems, sometimes for over a month.


In 2024, the average cost of a health care data breach was $9.97 million. The cost of these breaches will soon surpass the $27 billion ($44.5 billion in 2024 dollars) provided under HITECH to adopt EHRs.

2025 may see the first major revision since 2013 to the Health Insurance Portability and Accountability Act (HIPAA) Security Rule, which outlines how electronic protected health information must be secured. The proposed rule will likely force health care providers and their EHR vendors to make cybersecurity investment a much higher priority.

$100 Billion Spent on Health Care IT: Was the Juice Worth the (Mega) Squeeze?

The U.S. health care industry has spent more than $100 billion on information technology, but few providers are fully meeting President Bush’s vision of a nation of seamlessly interoperable and secure digital health records.

Many past government policymakers now admit they failed to understand the complex business dynamics, technical scale, complexity, or time needed to create a nationwide system of usable, interoperable EHR systems. The entire process lacked systems-engineering thinking. As Seema Verma, former administrator of the Centers for Medicare and Medicaid Services, told Fortune, “We didn’t think about how all these systems connect with one another. That was the real missing piece.”

Over the past eight years, successive administrations and congresses have taken actions to try to rectify these early oversights. In 2016, the 21st Century Cures Act was passed, which kept EHR system vendors and providers from blocking the sharing of patient data, and spurred them to start working in earnest to create a trusted health information exchange. The Cures Act mandated standardized application programming interfaces (APIs) to promote interoperability. In 2022, the Trusted Exchange Framework and Common Agreement (TEFCA) was published, which aims to facilitate technical principles for securely exchanging health information.

“The EHR venture has proved troublesome thus far. The trouble is far from over.” —John Leslie King, University of Michigan professor emeritus

In late 2023, the first Qualified Health Information Networks (QHINs) were approved to begin supporting the exchange of data governed by TEFCA, and in 2024, updates were made to the APIs to make information interoperability easier. These seven QHINs allow thousands of health providers to more easily exchange information. Combined with the emerging consolidation among hospital systems around three EHR vendors—Epic Systems Corp., Oracle Health, and Meditech—this should improve interoperability in the next decade.

These changes, says HIMSS’s Tom Leary, will help give “all patients access to their data in whatever format they want with limited barriers. The health care environment is starting to become patient-centric now. So, as a patient, I should soon be able to go out to any of my healthcare providers to really get that information.”

HIMSS’s Christina Grimes adds that part of the patient-centric change is the continuing consolidation of EHR system portals. “Patients really want one portal to interact with instead of the number they have today,” she says.

In 2024, the Assistant Secretary for Technology Policy / Office of the National Coordinator for Health IT, the U.S. government department responsible for overseeing electronic health systems’ adoption and standards, was reorganized to focus more on cybersecurity and advanced technology like AI. In addition to the proposed HIPAA security requirements, Congress is also considering new laws to mandate better cybersecurity. There is hope that AI can help overcome EHR system usability issues, especially clinician burnout and interoperability issues like patient matching.

Wachter states that the new AI scribes are showing real promise. “The way it works is that I can now have a conversation with my patient and look the patient in the eye. I’m actually focusing on them and not my keyboard. And then a note, formatted correctly, just magically appears. Almost ironically, this new set of AI technologies may well solve some of the problems that the last technology created.”

Whether these technologies live up to the hype remains to be seen. More concerning is whether AI will exacerbate the rampant feeling among providers that they have become tools of their tools and not masters of them.

As EHR systems become more usable, interoperable, and patient-friendly, the underlying foundations of medical care can be finally addressed. High-quality evidence backs only about 10 percent of the care patients receive today. One of the great potentials of digitizing health records is to discover what treatments work best and why and then distribute that information to the health care community. While this is an active research area, more research and funding are needed.

Twenty years ago, Tom Conrad, himself a senior computer scientist, told me he was skeptical that having more information necessarily meant better medical decisions would automatically follow. He pointed out that when doctors’ earnings are tied to the number of patients they see, there is a trade-off between the better care that EHRs enable and the sheer amount of time required to review a more complete medical record. Today, the trade-off is not in the patients’ or doctors’ favor. Whether it can ever be balanced is one of the great unknowns.

Obviously, no one wants to go back to paper records. However, as John Leslie King says, “The way forward involves multiple moving targets due to advances in technology, care, and administration. Most EHR vendors are moving as fast as they can.”

However, it would be foolish to think it will be smooth sailing from here on, King says: “The EHR venture has proved troublesome thus far. The trouble is far from over.”

Reference: https://ift.tt/dfEYpaG

Saturday, March 29, 2025

What could possibly go wrong? DOGE to rapidly rebuild Social Security codebase.


The so-called Department of Government Efficiency (DOGE) is starting to put together a team to migrate the Social Security Administration’s (SSA) computer systems entirely off one of its oldest programming languages in a matter of months, potentially putting the integrity of the system—and the benefits on which tens of millions of Americans rely—at risk.

The project is being organized by Elon Musk lieutenant Steve Davis, multiple sources who were not given permission to talk to the media tell WIRED, and aims to migrate all SSA systems off COBOL, one of the first common business-oriented programming languages, and onto a more modern replacement like Java within a tight time frame of just a few months.

Under any circumstances, a migration of this size and scale would be a massive undertaking, experts tell WIRED, but the expedited deadline runs the risk of obstructing payments to the more than 65 million people in the US currently receiving Social Security benefits.
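One concrete reason experts consider such a rewrite risky: COBOL business code relies on fixed-point decimal arithmetic for money, and a hasty port to a target language’s default binary floating point silently changes results. The snippet below is purely illustrative (it is not SSA code), using Python’s standard decimal module as a stand-in for COBOL-style fixed-point math:

```python
from decimal import Decimal

# Binary floating point cannot represent most decimal fractions
# exactly, so monetary sums drift in ways that COBOL-style
# fixed-point arithmetic never allows.
float_total = 0.10 + 0.20
fixed_total = Decimal("0.10") + Decimal("0.20")

print(float_total)  # 0.30000000000000004
print(fixed_total)  # 0.30

# A migration that casually swapped fixed-point fields for doubles
# would alter benefit calculations by fractions of a cent at a time.
assert float_total != 0.30
assert fixed_total == Decimal("0.30")
```

Hunting down every such semantic mismatch across tens of millions of lines of legacy code is a large part of why migrations of this scale normally take years, not months.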


Reference: https://ift.tt/ykr2O9Q

This Solar Engineer Is Patching Lebanon’s Power Grid




In Mira Daher’s home country of Lebanon, the national grid provides power for only a few hours a day. The country’s state-owned utility, Électricité du Liban (EDL), has long struggled to meet demand, and a crippling economic crisis that began in 2019 has worsened the situation. Most residents now rely on privately owned diesel-powered generators for the bulk of their energy needs.

But in recent years, the rapidly falling cost of solar panels has given Lebanese businesses and families a compelling alternative, and the country has seen a boom in private solar-power installations. Total installed solar capacity jumped nearly eightfold between 2020 and 2022 to more than 870 megawatts, primarily as a result of off-grid rooftop installations.

Daher, head of tendering at the renewable-energy company Earth Technologies, in Antelias, Lebanon, has played an important part in this ongoing revolution. She is in charge of bidding for new solar projects, drawing up designs, and ensuring that they are correctly implemented on-site.

“I enjoy the variety and the challenge of managing diverse projects, each with its own unique requirements and technical hurdles,” she says. “And knowing that my efforts also contribute to a sustainable future for Lebanon fills me with pride and motivates me a lot.”

An Early Mentor

Daher grew up in the southern Lebanese city of Saida (also called Sidon), where her father worked as an electrical engineer in the construction sector. His work helped inspire her interest in technology at a young age, she says. When she was applying for university, he encouraged her to study electrical engineering too.

“My first mentor was my father,” says Daher. “He increased my curiosity and passion for technology and engineering, and when I watched him work and solve complex problems, that motivated me to follow in his footsteps.”

In 2016, she enrolled at Beirut Arab University to study electrical and electronics engineering. When she graduated in 2019, Daher says, the country’s solar boom was just taking off, which prompted her to pursue a master’s degree in power and energy, with a specialization in solar power, at the American University of Beirut.

“My thesis concentrated on the energy situation in Lebanon and explored potential solutions to increase the reliance on renewable resources,” she says. “Five or six years ago, solar systems had high costs. But today the cost [has] decreased a lot because of new technologies, and because there is a lot of production of solar panels in China.”

Entering the Workforce

After graduating in 2021, Daher started a job as a solar-energy engineer at the Beirut-based solar-power company Mashriq Energy, where she was responsible for developing designs and bids for new solar installations, similar to her current role. It was a steep learning curve, Daher says, because she had to quickly pick up business skills, including financial modeling and contract negotiations. She also learned to deal with the practicalities of building large solar developments, such as site constraints and regulations. In 2022, she joined Earth Technologies as a solar project design engineer.

Various organizations, including Lebanese government and nongovernmental agencies such as the United Nations, request bids for solar power installations they want to build—a process known as tendering. Daher’s principal responsibility is to prepare and submit bids for these projects, but she also supervises their implementation.

Daher’s role requires her to maintain a broad base of knowledge about the solar projects she oversees. Photo: Mira Daher

“I oversee the entire project cycle, from identifying and managing tenders to designing, pricing, and implementing solar projects across residential, industrial, commercial, and utility sectors,” she says.

The first step in the process is to visit the proposed installation site to determine where solar panels should be positioned based on the landscape and local weather conditions. Once this is done, Daher and her team come up with a design for the plant. This involves figuring out what kinds of solar panels, inverters, and batteries will fit the budget and how to wire all the components together.

The team runs simulations of the proposed plant to ensure that the design meets the client’s needs. Daher is then responsible for negotiating with the client to make sure that the proposal fulfills their technical and budgetary requirements. Once the client has approved the design, other teams oversee construction of the plant, though Daher says she makes occasional site visits to ensure the design is being implemented correctly.
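To give a rough sense of the sizing arithmetic that goes into such a proposal, here is a first-pass sketch; the load, sun-hour, and panel figures are hypothetical illustrations, not numbers from Daher’s projects, and real designs replace them with site measurements and full simulations:

```python
import math

def array_size_kw(daily_load_kwh, peak_sun_hours, performance_ratio=0.8):
    """First-pass PV array size needed to cover a daily energy demand,
    derated by a typical overall system performance ratio."""
    return daily_load_kwh / (peak_sun_hours * performance_ratio)

def panel_count(array_kw, panel_watts=550):
    """Number of panels, rounded up, for a given array size."""
    return math.ceil(array_kw * 1000 / panel_watts)

# Hypothetical example: a facility consuming 120 kWh per day at a
# site averaging about 5 peak sun hours.
kw = array_size_kw(120, 5)
print(f"{kw:.1f} kW array, about {panel_count(kw)} panels")
# 30.0 kW array, about 55 panels
```

Battery capacity, inverter ratings, and cabling follow from similar estimates, which is why the early design stage is mostly about matching these numbers to the client’s budget.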

Daher’s role requires her to have a solid understanding of all the components that go into a solar plant, from the different brands of power electronics to the civil engineering required to build supporting structures for solar panels. “You have to know everything about the project,” she says.

Solar Power for Development

Earth Technologies operates across the Middle East and Africa, but Daher says most of the solar installations she works on are in Lebanon. Some of the most interesting have been development-focused projects funded by the U.N.

Daher led the U.N.-funded installation of solar panels at nine hospitals, as well as a project that uses solar power to pump water to people in remote parts of the country. More recently, she has started work on a solar and battery installation for street lighting in the town of Bourj Hammoud, which will allow shops to stay open later and help to boost the local economy. The projects she has overseen generally cost around US $700,000 to $800,000.

But securing funding for renewable projects is an ongoing challenge in Lebanon, says Daher, given the uncertain economic situation. More recently, the country was also rocked by the conflict between Israel and the Lebanon-based paramilitary group Hezbollah. This resulted in widespread bombing of Beirut, the capital, and the country’s southern regions last October and November.

“The two months of conflict were incredibly challenging,” says Daher. “The environment was unsafe and filled with uncertainty, leaving us constantly anxious about what the future held.”

Safety concerns forced her to relocate from her home in Beirut to a village called Ain El Jdideh. This meant she had to drive about an hour and a half on unsafe roads to get to work. Several of the major projects she was working on were also halted as they were in the areas that bore the brunt of the conflict. One U.N.-funded project she worked on in Ansar, in southern Lebanon, was knocked offline when an adjacent building was destroyed.

“Despite these hardships, we persevered, and I am grateful that the war has ended, allowing us to regain some stability and security,” says Daher.

A Challenging But Fulfilling Career

Despite these difficulties, Daher remains optimistic about the future of renewable energy in Lebanon, and she says it can be a deeply rewarding career. Breaking into the industry requires a strong educational foundation, though, so she recommends first pursuing a degree focused on power systems and renewable technologies.

The energy sector is a male-dominated field, says Daher, which can make it difficult for women to find their footing. “I’ve often encountered biases, stereotypes that can make it more difficult to be taken seriously, or to have my voice heard,” she adds. “Overcoming these obstacles requires resilience, confidence, and a commitment to demonstrating my expertise and capabilities.”

It also requires a commitment to continual learning, due to the continued advances being made in solar-power technology. “It’s very important to stay up to date,” she says. “This field is always evolving. Every day, you can see a lot of new technologies.”

Reference: https://ift.tt/6yztBYb

Friday, March 28, 2025

Scientists are storing light we cannot see in formats meant for human eyes


Imagine working with special cameras that capture light your eyes can't even see—ultraviolet rays that cause sunburn, infrared heat signatures that reveal hidden writing, or specific wavelengths that plants use for photosynthesis. Or perhaps using a special camera designed to distinguish the subtle visible differences that make paint colors appear just right under specific lighting. Scientists and engineers do this every day, and they're drowning in the resulting data.

A new compression format called Spectral JPEG XL might finally solve this growing problem in scientific visualization and computer graphics. Researchers Alban Fichet and Christoph Peters of Intel Corporation detailed the format in a recent paper published in the Journal of Computer Graphics Techniques (JCGT). It tackles a serious bottleneck for industries working with these specialized images. These spectral files can contain 30, 100, or more data points per pixel, causing file sizes to balloon into multi-gigabyte territory—making them unwieldy to store and analyze.

When we think of digital images, we typically imagine files that store just three colors: red, green, and blue (RGB). This works well for everyday photos, but capturing the true color and behavior of light requires much more detail. Spectral images aim for this higher fidelity by recording light's intensity not just in broad RGB categories, but across dozens or even hundreds of narrow, specific wavelength bands. This detailed information primarily spans the visible spectrum and often extends into near-infrared and near-ultraviolet regions crucial for simulating how materials interact with light accurately.
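The storage pressure is easy to see with back-of-the-envelope arithmetic; the resolution and bit depth below are illustrative assumptions, not figures from the paper:

```python
def image_size_bytes(width, height, channels, bytes_per_sample):
    """Uncompressed size of an image storing `channels` samples per pixel."""
    return width * height * channels * bytes_per_sample

# The same 4K-ish frame as 8-bit RGB versus a spectral image
# with 100 wavelength bands stored as 32-bit floats.
rgb = image_size_bytes(4096, 2160, 3, 1)
spectral = image_size_bytes(4096, 2160, 100, 4)

print(f"RGB:      {rgb / 1e6:.0f} MB")       # ~27 MB
print(f"Spectral: {spectral / 1e9:.1f} GB")  # ~3.5 GB
```

At those sizes, even a short sequence of frames lands in multi-gigabyte territory, which is exactly the bottleneck a spectral compression format targets.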


Reference: https://ift.tt/LHo79f4

Oracle has reportedly suffered 2 separate breaches exposing thousands of customers’ PII


Oracle isn’t commenting on recent reports that it has experienced two separate data breaches that have exposed sensitive personal information belonging to thousands of its customers.

The most recent data breach report, published Friday by Bleeping Computer, said that Oracle Health—a health care software-as-a-service business the company acquired in 2022—had learned in February that a threat actor accessed one of its servers and made off with patient data from US hospitals. Bleeping Computer said Oracle Health customers have received breach notifications that were printed on plain paper rather than official Oracle letterhead and were signed by Seema Verma, the executive vice president & GM of Oracle Health.

The other data breach report surfaced eight days ago, when an anonymous person using the handle rose87168 published a sampling of what they said were 6 million records of authentication data belonging to Oracle Cloud customers. Rose87168 told Bleeping Computer that they had acquired the data a little more than a month earlier after exploiting a vulnerability that gave access to an Oracle Cloud server.


Reference: https://ift.tt/QpxMRXL

Video Friday: Watch this 3D-Printed Robot Escape




Your weekly selection of awesome robot videos

Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

RoboSoft 2025: 23–26 April 2025, LAUSANNE, SWITZERLAND
ICUAS 2025: 14–17 May 2025, CHARLOTTE, NC
ICRA 2025: 19–23 May 2025, ATLANTA, GA
London Humanoids Summit: 29–30 May 2025, LONDON
IEEE RCAR 2025: 1–6 June 2025, TOYAMA, JAPAN
2025 Energy Drone & Robotics Summit: 16–18 June 2025, HOUSTON, TX
RSS 2025: 21–25 June 2025, LOS ANGELES
ETH Robotics Summer School: 21–27 June 2025, GENEVA
IAS 2025: 30 June–4 July 2025, GENOA, ITALY
ICRES 2025: 3–4 July 2025, PORTO, PORTUGAL
IEEE World Haptics: 8–11 July 2025, SUWON, KOREA
IFAC Symposium on Robotics: 15–18 July 2025, PARIS
RoboCup 2025: 15–21 July 2025, BAHIA, BRAZIL
RO-MAN 2025: 25–29 August 2025, EINDHOVEN, NETHERLANDS

Enjoy today’s videos!

This robot can walk right off the 3D printer, without electronics, needing only a cartridge of compressed gas. It can also be printed in one go, from one material. Researchers from the University of California San Diego and BASF describe how they developed the robot in an advance online publication in the journal Advanced Intelligent Systems. They used the simplest technology available: a desktop 3D printer and an off-the-shelf printing material. This design approach is not only robust, it is also cheap—each robot costs about $20 to manufacture.

And details!

[ Paper ] via [ University of California San Diego ]

Why do you want a humanoid robot to walk like a human? So that it doesn’t look weird, I guess, but it’s hard to imagine that a system that doesn’t have the same arrangement of joints and muscles that we do will move optimally by just trying to mimic us.

[ Figure ]

I don’t know how it manages it, but this little soft robotic worm somehow moves with an incredible amount of personality.

Soft actuators are critical for enabling soft robots, medical devices, and haptic systems. Many soft actuators, however, require power to hold a configuration and rely on hard circuitry for control, limiting their potential applications. In this work, the first soft electromagnetic system is demonstrated for externally-controlled bistable actuation or self-regulated astable oscillation.

[ Paper ] via [ Georgia Tech ]

Thanks, Ellen!

A 180-degree pelvis rotation would put the “break” in “breakdancing” if this were a human doing it.

[ Boston Dynamics ]

My colleagues were impressed by this cooking robot, but that may be because journalists are always impressed by free food.

[ Posha ]

This is our latest work on a hybrid aerial-terrestrial quadruped robot called SPIDAR, which shows unique and complex locomotion styles in both aerial and terrestrial domains, including thrust-assisted crawling motion. This work was presented at the International Symposium on Robotics Research (ISRR) 2024.

[ Paper ] via [ Dragon Lab ]

Thanks, Moju!

This fresh, newly captured video from Unitree’s testing grounds showcases the breakneck speed of humanoid intelligence advancement. Every day brings something thrilling!

[ Unitree ]

There should be more robots that you can ride around on.

[ AgileX Robotics ]

There should be more robots that wear hats at work.

[ Ugo ]

iRobot, which pioneered giant docks for robot vacuums, is now moving away from giant docks for robot vacuums.

[ iRobot ]

There’s a famous experiment in which a dead fish placed in a current starts swimming, purely because of its biomechanical design. Somehow, you can do the same thing with an unactuated quadruped robot on a treadmill.

[ Delft University of Technology ]

Mush! Narrowly!

[ Hybrid Robotics ]

It’s freaking me out a little bit that this couple is apparently wandering around a huge mall that is populated only by robots and zero other humans.

[ MagicLab ]

I’m trying, I really am, but the yellow is just not working for me.

[ Kepler ]

By having Stretch take on the physically demanding task of unloading trailers stacked floor to ceiling with boxes, Gap Inc. has reduced injuries, lowered turnover, and watched employees get excited about automation intended to keep them safe.

[ Boston Dynamics ]

Since arriving at Mars in 2012, NASA’s Curiosity rover has been ingesting samples of Martian rock, soil, and air to better understand the past and present habitability of the Red Planet. Of particular interest to its search are organic molecules: the building blocks of life. Now, Curiosity’s onboard chemistry lab has detected long-chain hydrocarbons in a mudstone called “Cumberland,” the largest organics yet discovered on Mars.

[ NASA ]

This University of Toronto Robotics Institute Seminar is from Sergey Levine at UC Berkeley, on Robotics Foundation Models.

General-purpose pretrained models have transformed natural language processing, computer vision, and other fields. In principle, such approaches should be ideal in robotics: since gathering large amounts of data for any given robotic platform and application is likely to be difficult, general pretrained models that provide broad capabilities present an ideal recipe to enable robotic learning at scale for real-world applications.
From the perspective of general AI research, such approaches also offer a promising and intriguing approach to some of the grandest AI challenges: if large-scale training on embodied experience can provide diverse physical capabilities, this would shed light not only on the practical questions around designing broadly capable robots, but also on the foundations of situated problem-solving, physical understanding, and decision making. However, realizing this potential requires overcoming a number of challenging obstacles. What data shall we use to train robotic foundation models? What will be the training objective? How should alignment or post-training be done? In this talk, I will discuss how we can approach some of these challenges.

[ University of Toronto ]

