
Artificial intelligence and on-the-job safety

“The fabric of the future of work”

Photos: Thinkhubstudio/iStockphoto


Natural language processing

Natural language processing can help safety pros by taking data sets from incident reports, observations and inspections, among other items, and finding insights within them.

Reading hundreds or thousands of reports, potentially millions of words, is a time-consuming task for people. Deriving insights from all of that data takes even more time and bandwidth.

In addition, reports may be loosely written narratives or contain unstructured data. Natural language processing can take those reports and find patterns – such as near misses or incidents happening at certain times or in certain areas of a facility.
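As a rough, hypothetical sketch of that idea – not a description of any particular vendor's tool – a few lines of Python using scikit-learn can turn free-text incident narratives into word-frequency vectors and group them into recurring themes. The sample reports and the cluster count below are invented for illustration.

# A minimal sketch: group free-text incident reports into recurring themes.
# The sample narratives and the cluster count are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

reports = [
    "Near miss: forklift almost struck a worker at dock 3 during the night shift",
    "Worker slipped on a wet floor in the packaging area; no injury reported",
    "Forklift operator reported poor visibility at the dock 3 loading bay",
    "Spill in the packaging area left uncleaned; slip hazard noted on inspection",
]

# Turn each narrative into a weighted word-frequency (TF-IDF) vector,
# ignoring common English stop words.
vectors = TfidfVectorizer(stop_words="english").fit_transform(reports)

# Cluster similar narratives; with real data, the groups tend to surface
# recurring locations, times of day or hazard types.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for label, text in zip(labels, reports):
    print(f"theme {label}: {text}")

With a handful of sample narratives the output is trivial, but applied to thousands of reports the same approach can point a safety pro toward the areas, shifts or hazard types that keep showing up.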

During the NIOSH webinar, experts highlighted another use of natural language processing: automated coding of workers’ compensation claims.

One part of natural language processing is sentiment analysis. When programmed correctly, that analysis can identify certain words and phrases that denote feelings or attitudes.

During their Congress & Expo session, Hornsby and Roberts highlighted the use of sentiment analysis on safety culture surveys for better insights. That capability, in turn, allows surveys to rely on more open-ended answers instead of, say, multiple-choice or “prescriptive” responses.

“What has always lacked in those scenarios is the ability to talk openly and freely about what workers are seeing and what they’re experiencing,” Hornsby said.
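As a hypothetical illustration of how that might work – not the specific tool Hornsby and Roberts discussed – an off-the-shelf sentiment lexicon such as NLTK's VADER can assign each open-ended comment a score so the most negative responses get a closer look. The sample comments below are invented.

# A minimal sketch: score open-ended safety survey comments with NLTK's
# VADER sentiment lexicon. The sample comments are invented for illustration.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
analyzer = SentimentIntensityAnalyzer()

comments = [
    "I feel pressured to skip lockout steps when we're behind schedule.",
    "My supervisor always backs me up when I stop work over a hazard.",
]

for comment in comments:
    # 'compound' runs from -1 (most negative) to +1 (most positive);
    # strongly negative comments can be flagged for follow-up.
    score = analyzer.polarity_scores(comment)["compound"]
    print(f"{score:+.2f}  {comment}")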

AI-enabled cameras
Examples of AI-enabled cameras that help monitor (clockwise from top left): safety violations, ergonomics, PPE use and machine guarding. Photos courtesy of Benchmark ESG in partnership with Integration Wizards.

How to start

Many employers likely already have AI deployed in their business practices, Vietas said, adding that it’s important to understand how the basic concepts of AI may help strengthen workplace safety and health. Naturally, each organization or location within it may have different needs and may have to figure out which programs might work best.

Before employers introduce new technology, though, Hornsby recommends thinking about what safety issues need addressing, to see which technology can aid your organization and employees.

“I think there’s the problem with a lot of people is they fall in love with the technology, this whole ‘shiny object’ syndrome,” Hornsby said. “We need to first think about what the problems are and decide whether or not there’s a technology that can help us.”

Another piece of advice from Hornsby: Start small – a proof of concept or pilot program, for example – with one safety issue, instead of trying to tackle a larger one or multiple problems.

“Figure out something that you can get your arms around, get quick buy-in, see what happens,” he said. “Let it inform what you might do on a broader scale. Once you prove the value, improve the concept.”

He pointed out that many technologies can work with existing equipment, such as a facility’s closed-circuit cameras.

Privacy and other concerns

A continuous eye on workers from cameras or wearables will likely raise concerns over privacy and data security. One of the best ways to address these concerns, Hornsby said, is to allow workers to have a “seat at the table” when developing AI strategies and considering any related issues.

“I don’t know how you would build trust if you’re just dumping a technology out there and asking folks to ‘trust us on this,’” he said. “If they have a seat at the table, then they understand the motivations and they understand the objectives. Ultimately, organizations are just trying to find a way to keep people safe.”

Management and safety leaders should remain upfront and open while implementing new technologies. AI shouldn’t be a “black box,” Vietas said during the webinar. It should be “transparently implemented.”

Emerging technologies

In January 2019, the National Safety Council launched Work to Zero, an initiative that explores how emerging technologies can be used to prevent on-the-job injuries. Learn more at nsc.org/worktozero.

Expertise still needed

With the introduction of new technology in the workplace, a common – and sometimes justifiable – fear among workers is that it’ll make their jobs expendable. However, the experts say that’s not likely to happen with AI and safety pros.

The use of AI requires experts to guide it and keep it on track. During the webinar, Darabi offered the example of AI being used in the mining industry.

“You need people who understand mining and understand the hazards,” he said. “They understand the situations where unsafe events could happen. You need the workers to tell you how they feel and how they can use the technology. So, in order for an AI system to work, you can’t just bring a programmer in and expect everything to work.”

Hornsby pointed out that “management of change” is a big factor in ensuring an AI program continues to gather the right data. An organization might add a new shift, new personnel or new processes, or expand its plant.

“All these things introduce new variables that the AI machine learning is going to have to factor in,” he said. “In a lot of cases, it’s going to take humans to help influence that.”

Vietas said AI likely will prove “very complementary to most, if not all, jobs” in the future. Therefore, it’s important to have an understanding of how to employ it and improve “the human outcomes that we’re all interested in,” whether it’s in a corporation or in society.

“It’s a tool,” he added. “It’s just another tool for humans to use.”
