Legal Technology

How employers rise to the new challenges of AI and the needs of their employees

Megha Jain  Lawyer

· 8 minute read


As employers seek ways to incorporate the benefits that AI might bring to work operations, one law firm has begun a process to better integrate AI into lawyering

Artificial intelligence (AI) and the practice of law have developed an interesting relationship over the past few years. Some within law firms and corporate law departments have feared AI, while others have embraced what it may be able to do for them. From e-discovery platforms to document review, lawyers are often asked to draft documents and extract pertinent information from copious amounts of reading material, and several firms and companies have embraced the utility of AI to perform those functions.

One firm that has embraced AI and used it to help facilitate a new practice group is BakerHostetler, which has initiated efforts to better integrate AI into lawyering. The firm responds to client needs related to AI through its Emerging Technology (ET) practice, made up of practitioners with deep AI experience, and its IncuBaker™ program, which originated as a research and development team and now provides legal tech consulting across an expanded suite of services, including Enterprise and Legal Analytics, Intelligent Automation, Software Selection, Privacy Management, and Contract Analytics.

Right now, the ET practice includes several experienced practitioners who have advised dozens of clients on AI policies and related legal requirements, and 14 adjacent professionals on the IncuBaker team, themselves variously lawyers, engineers, and data scientists, who are working to innovate AI technology on behalf of both the firm’s clients and BakerHostetler’s law firm practices and processes generally.

Indeed, the ET practice and IncuBaker team have advised well over 50 clients, including multinational manufacturers, major and regional health care companies, oil and gas clients, and almost every other industry imaginable (including pharmaceutical manufacturers, retail companies, and heavily regulated ― and scrutinized ― organizations). Both teams' client bases are continuously expanding, and the lessons the teams have learned could help companies and attorneys tackle emerging AI complexities.

AI in an evolving world

On April 6, New York City's Department of Consumer and Worker Protection (DCWP) adopted highly anticipated final rules implementing the city's law regulating the use of automated employment decision tools (AEDT) in hiring; the law went into effect on July 5. The AEDT law, which was passed last January, restricts the use of automated employment decision tools and AI by employers and employment agencies by requiring that such tools undergo bias audits and that employers and employment agencies notify employees and job candidates when such tools are being used to evaluate them.
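The bias audits the DCWP rules require center on an impact ratio: the selection rate for each demographic category compared against the rate of the most-selected category. A minimal sketch of that arithmetic, using entirely hypothetical numbers:

```python
# Illustrative sketch of the impact-ratio calculation behind an AEDT bias
# audit: each group's selection rate divided by the highest group's rate.
# The group names and counts below are hypothetical.

def impact_ratios(groups):
    """groups: {name: (selected, applicants)} -> {name: impact ratio}"""
    rates = {g: selected / applicants for g, (selected, applicants) in groups.items()}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

# Hypothetical audit data: group_a selected 48 of 120 (rate 0.40),
# group_b selected 30 of 100 (rate 0.30).
ratios = impact_ratios({"group_a": (48, 120), "group_b": (30, 100)})
# group_a's ratio is 1.0; group_b's is 0.30 / 0.40 = 0.75
```

A ratio well below 1.0 for a group is the kind of disparity an auditor would flag for closer review.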


Even with these restrictions, however, the Emerging Tech team and the IncuBaker professionals are finding use cases for such tools that still abide by the regulations. “There are numerous applications that can extract clauses; for example, you can feed the contract and it could identify the clause needed for a case,” says James Sherer, Partner at BakerHostetler, who co-leads the Emerging Technology team of BakerHostetler’s Digital Assets and Data Management group while directing the firm’s Artificial Intelligence and Information Governance engagements.
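As a toy illustration of the clause-extraction idea Sherer describes (production tools rely on trained models rather than keywords, and every name below is hypothetical), a simple pattern-based pass over a contract might look like:

```python
import re

# Hypothetical sketch: locate the paragraphs of a contract that contain
# particular clause types, so a reviewer can jump straight to them.
CLAUSE_PATTERNS = {
    "indemnification": re.compile(r"(?i)indemnif\w+"),
    "termination": re.compile(r"(?i)terminat\w+"),
}

def extract_clauses(contract_text):
    """Return, per clause type, the paragraphs matching its pattern."""
    paragraphs = [p.strip() for p in contract_text.split("\n\n") if p.strip()]
    return {
        name: [p for p in paragraphs if pattern.search(p)]
        for name, pattern in CLAUSE_PATTERNS.items()
    }
```

Real systems add model-based classification and entity extraction on top, but the workflow is the same: feed in the contract, get back the clauses needed for the matter.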

Ariana Dindiyal, an associate at BakerHostetler who works on AI legal research for the firm, has worked with and in support of high-profile clients in developing their AI initiatives, which often concern the development of technology, protection of financial services, and data privacy concerns with AI as well as making sure employers are compliant. In practice, she has researched regulatory protections around AI, as well as the use of chatbots in marketing and advertising. Clients are looking at a variety of use cases with AI, she says, including code generation, AI-generated images, and how to interface with both employees and consumers in an AI-enabled world.

“There is a fear that AI is going to take over jobs,” Dindiyal says, adding that “it is hard to say how AI will be used for cases right now.” She observes that “things are still very much in the exploratory phase [and] that it is hard to concretely say where AI is headed” in the legal profession, and more particularly, in litigation.

Katherine Lowry, Chief Information Officer and Co-Chair of the Emerging Technology team of BakerHostetler’s Digital Assets and Data Management Group and Head of IncuBaker, says that “as tech increases, vendors are hitting us with new features that really need to be explored.”

She notes that the IncuBaker dashboard is monitoring a number of different models as it stands, but even with pressure from vendors to do more, the firm is taking time to discern what applications will work best for the firm and its clients. “We have to do more due diligence into risks and technology, and understand what happens with our data,” Lowry says. “[Clients] are also asking the firm, are you going to create something with AI, how are you using our data, and how are you using ChatGPT and other generative AI applications? Innovation takes discipline and time to study how these systems work in order to provide the best service to our clients.”

How employers can protect themselves on AI implementation

For employers looking to make a similar leap into building out AI tools, there are several ways to ease the transition of AI into day-to-day operations with sound labor and employment practices. These include:

Hiring practices — First, make sure that hiring managers who are using AI tools are aware that these tools can create bias and discriminatory hiring practices when left on their own. As Sherer says, “Regular audits are a must when scanning through resumes ― as we become more tech-focused, we have to add back in the human element.” The Emerging Tech team recommends that users run audits periodically (such as once a month or once every three weeks) to make sure that there are no unintended consequences with the firm’s hiring practices.

Keep informed — Employers should pay attention to their organization's jurisdiction and what is being implemented there. For example, New York City adopted the AEDT guidance for AI use, and other jurisdictions could implement similar guidance. To be safe, the Emerging Tech team recommends checking your firm's jurisdiction frequently as laws evolve. A few resources can help employers do this, such as the International Association of Privacy Professionals' AI legislation trackers. These trackers include laws already in effect, such as the Illinois Artificial Intelligence Video Interview Act (820 ILCS 42) and Maryland's Labor and Employment – Use of Facial Recognition Services – Prohibition (H.B. 1202), but they also treat privacy laws that address automated decision-making and profiling as part of AI law proper. This encompasses a number of other states ― California, Colorado, Connecticut, Virginia, Utah, and others ― as well as some very specific measures, such as Georgia's law regarding automated eye assessments and North Dakota's act clarifying that AI is not a person.

The soft law approach — As AI continues to develop into an integral part of human life, its positive benefits need to be maintained while its risks are controlled. The soft law approach does just that: it accounts for the rapidly progressing AI environment and the concurrently evolving needs of humans. As discussed in an article by Dindiyal, Sherer, and their co-author Cat Casey, titled AI-human Interaction – Soft law considerations, 1 JARWA 4 (360-370), the recommended course of action under soft law is to keep good records and maintain proper compliance documentation as much as possible wherever the law remains gray or ambiguous.

Compliance with ADA accommodations — Employers should explore the expanded use of AI tools as part of reasonable accommodations for employees, such as AI-powered automated readers to help employees with visual impairments.

ABA rules — Lawyers should make sure they check the American Bar Association (ABA) rules when using AI in the practice of law. Although AI itself is only briefly mentioned in ABA Resolution 112 (2019), the Model Rules stress technology competence and continuing legal education throughout. And as some lawyers have found out, they cannot rely solely on AI to analyze and provide a comprehensive legal view with the most up-to-date information. In addition, when using AI tools on client matters, lawyers must make sure that attorney-client privilege is maintained and that the new tools do not create breaches of data security or confidentiality.

By following these approaches and regulations, companies and the legal partners they work with can not only keep up with needed regulation but also answer the top question on stakeholder and board member agendas.

As Adele Hogan, partner at Otterbourg P.C. and Head of the firm’s Securities Practice Group, says, “Board members are increasingly asking companies to explain AI for employment and other uses — it’s the top thing they are looking at — people are asking from both a liability management and a business perspective.”
