Generative AI continues to take the world by storm, but there are growing concerns that, if not aggressively managed and regulated, the technology could do a great deal of harm. Beyond fears about AI autonomously making decisions and acting against our interests, there is a set of concerns around the training sets it builds.
These training sets will increasingly capture everything an employee does. That data can be used to assess employee productivity; track the creation of confidential documents, offerings, and products; and eventually create a digital twin of the organization that deployed the technology.
Let’s talk about each in turn.
Misusing training sets to gauge employee ‘productivity’
As employees increasingly use generative AI, it will capture everything they do. Monitoring what employees do during the workday would seem an obvious use for this data, but employees will likely feel their privacy is being violated. And if care isn't taken to tie worker behavior to results, companies could make bad decisions.
For instance, an employee who works long hours but is relatively inefficient might be rated above an employee who works short hours but is highly efficient. If the focus is on hours worked instead of results, not only will the training set favor inefficient behavior, but efficient employees who should be kept on board will be managed out.
The right way to do this is with the employee's permission and the assurance that AI will be used to enhance, not replace; the focus should be on efficiency, not raw hours worked. That way, the training set can be used to create more efficient tools and digital twins, and to train employees to be more efficient.
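To make the measurement pitfall concrete, here is a minimal sketch. The names, numbers, and the "tasks completed" stand-in for results are all hypothetical, but they show how ranking by hours logged can invert the ranking a results-per-hour measure would give:

```python
# Hypothetical sketch: why ranking by hours logged can invert a
# results-based ranking. All names and numbers are illustrative only.
from dataclasses import dataclass

@dataclass
class Employee:
    name: str
    hours_logged: float   # hours captured by the monitoring tool
    tasks_completed: int  # actual results over the same period

    @property
    def efficiency(self) -> float:
        """Results per hour worked."""
        return self.tasks_completed / self.hours_logged

staff = [
    Employee("long-hours worker", hours_logged=60, tasks_completed=12),
    Employee("short-hours worker", hours_logged=30, tasks_completed=18),
]

top_by_hours = max(staff, key=lambda e: e.hours_logged)
top_by_efficiency = max(staff, key=lambda e: e.efficiency)

print(f"Top by hours logged: {top_by_hours.name}")       # long-hours worker
print(f"Top by efficiency:   {top_by_efficiency.name}")  # short-hours worker (0.6 vs. 0.2 tasks/hour)
```

Score by hours and the long-hours worker wins; score by results per hour and the short-hours worker does, three times over.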
Employees who know AI-based tools will be helpful rather than punitive are more likely to embrace the technology.
Security is a must
There is another danger: the data sets created by capturing employee behavior could themselves be a major liability. They could include highly proprietary products, processes, and internal operations that competitors, governments, and hostile actors could use to gain insight into a firm's operations.
Access to a training set from an engineer, engineering manager, or executive could provide deep insights into how they make decisions, what decisions they’ve made, plans for future products and their status, problems within the company — and secrets a company would prefer to remain secret.
Even if a specific source is hidden, a smart researcher could, just from the nature and detail of the content, determine who contributed it and what that employee does. That information could be highly valuable to a hostile actor or corporate rival and needs to be protected. And because these tools are tied to individual employees' work, the likelihood of a data set leaving with a departing employee, or being compromised in a home office, is high.
Protecting against that is critical to the continued operations of a company.
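One concrete piece of that protection is treating captured training data like any other crown-jewel asset: restrict who can read it and audit every access attempt. The sketch below is purely illustrative; the roles, policy, and function name are hypothetical, not a prescribed implementation:

```python
# Hypothetical sketch: gating and auditing access to a captured
# training set. Roles, paths, and policy are illustrative only.
import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("training-set-audit")

# Illustrative policy: only these roles may read behavior-derived data.
AUTHORIZED_ROLES = {"ml-platform-admin", "security-reviewer"}

def read_training_set(path: str, user: str, role: str) -> bytes:
    """Return the data set only for authorized roles; audit every attempt."""
    if role not in AUTHORIZED_ROLES:
        log.warning("DENIED  user=%s role=%s path=%s", user, role, path)
        raise PermissionError(f"{role!r} may not read {path}")
    log.info("GRANTED user=%s role=%s path=%s", user, role, path)
    with open(path, "rb") as fh:
        return fh.read()
```

In practice this would sit behind whatever identity and key-management infrastructure the company already runs; the point is simply that behavior-derived data sets deserve the same gating as source code or financials.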
It gets better — and worse
Aggregate training sets across a company and you could gain insights into the firm's operations that lead to a far more efficient and profitable business. (Of course, this same information in the hands of a regulator or hostile attorney could provide nearly unimpeachable evidence of wrongdoing.) Or imagine a competitor gaining access to this kind of information: it could effectively create a digital clone of the firm and use it to better anticipate, and more aggressively respond to, the competitive actions of the company using generative AI.
This level of competitive exposure is unprecedented; a rival that gained access to the firm's training files could effectively push the compromised company out of business.
Generative AI is a real game-changer, but it comes with risks. We know it isn't yet mature, we know its answers can't always be trusted, and we know it can be used to create avatars designed to fool us into buying things we don't need. And while it brings opportunities to boost employee productivity, it can also become a massive security risk.
Here’s hoping you and your company learn how to use it right.