"Better and personalized healthcare through AI is still a hugely challenging problem that will take an army of scientists and engineers." Photo via UH.edu

We are currently in the midst of what some have called the "wild west" of AI. Though healthcare is one of the most heavily regulated sectors, the regulation of AI in this space is still in its infancy. The rules are being written as we speak. We are playing catch-up, learning how to reap the benefits these technologies offer while trying to minimize harms from tools that have already been deployed.

AI systems in healthcare can exacerbate existing inequities. We have seen this dynamic play out with real-world consequences, from racial bias in the American justice system and in credit scoring to gender bias in resume screening applications. Programs designed to bring machine "objectivity" and ease to our systems end up reproducing and upholding biases, with no clear means of accountability.

The algorithm itself is seldom the problem. More often, it is the data used to train the technology that merits concern. But this is about far more than ethics and fairness. Building AI tools that account for the whole picture of healthcare is fundamental to creating solutions that work.

The Algorithm is Only as Good as the Data

By nature of our own human systems, datasets are almost always partial and rarely ever fair. As Linda Nordling comments in the Nature article "A fairer way forward for AI in healthcare," "this revolution hinges on the data that are available for these tools to learn from, and those data mirror the unequal health system we see today."

Take, for example, the finding that Black people in US emergency rooms are 40 percent less likely to receive pain medication than are white people, and Hispanic patients are 25 percent less likely. Now, imagine the dataset these findings are based on is used to train an algorithm for an AI tool that would be used to help nurses determine if they should administer pain relief medication. These racial disparities would be reproduced and the implicit biases that uphold them would remain unquestioned, and worse, become automated.

We can attempt to mitigate these biases by removing from the training data the variables we believe are responsible, but hidden patterns that correlate with demographic attributes will remain. An algorithm cannot take in the nuances of the full picture; it can only learn from the patterns in the data it is presented with.
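
A minimal, hypothetical sketch of that failure mode, using synthetic data and the widely available numpy and scikit-learn libraries: the demographic column is dropped before training, yet a correlated proxy (here, a made-up "zip code" feature) lets the model reproduce the disparity encoded in the historical treatment labels.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

group = rng.integers(0, 2, n)               # 0/1 stand-in for a demographic group
zip_code = group + rng.normal(0, 0.3, n)    # proxy feature strongly correlated with group
pain_score = rng.normal(5, 1, n)            # clinically relevant feature, identical across groups

# Historical labels encode the documented disparity: at the same reported pain
# level, group 1 was given pain relief far less often.
p_treat = 1 / (1 + np.exp(-(pain_score - 5))) * np.where(group == 1, 0.6, 1.0)
treated = rng.binomial(1, p_treat)

# Train WITHOUT the group column -- only the proxy and the pain score remain.
X = np.column_stack([zip_code, pain_score])
model = LogisticRegression().fit(X, treated)
pred = model.predict(X)

print("predicted treatment rate, group 0:", pred[group == 0].mean())
print("predicted treatment rate, group 1:", pred[group == 1].mean())
# The gap persists: the model learned the disparity from the proxy,
# not from the column that was removed.
```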

Bias Creep

Data bias creeps into healthcare in unexpected ways. Consider the fact that animal models used in laboratories across the world to discover and test new pain medications are almost entirely male. As a result, many medications, including pain medication, are not optimized for females. So, it makes sense that even common pain medications like ibuprofen and naproxen have been proven to be more effective in men than women and that women tend to experience worse side effects from pain medication than men do.

In reality, male rodents aren't perfect test subjects either. Studies have also shown that both female and male rodents' responses to pain levels differ depending on the sex of the human researcher present. The stress response elicited in rodents to the olfactory presence of a sole male researcher is enough to alter their responses to pain.

While this example may seem to be a departure from AI, it is in fact deeply connected — the current treatment choices we have access to were implicitly biased before the treatments ever made it to clinical trials. The challenge of AI equity is not a purely technical problem, but a very human one that begins with the choices that we make as scientists.

Unequal Data Leads to Unequal Benefits

In order for all of society to enjoy the many benefits that AI systems can bring to healthcare, all of society must be equally represented in the data used to train these systems. While this may sound straightforward, it's a tall order to fill.

Data from some populations don't always make it into training datasets. This can happen for a number of reasons: some data are harder to access, and some are never collected at all because of systemic barriers such as a lack of access to digital technology, or simply because they are deemed unimportant. Predictive models are built by finding meaningful categories in data, and because there is generally less of it, data from "minority" groups tends to look like an outlier and is often discarded as spurious in order to produce a cleaner model.
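
As a hypothetical illustration of that last point, consider a routine cleaning step, a z-score outlier filter tuned to the overall distribution, applied to synthetic data in which a small subgroup has a shifted but perfectly valid range of values:

```python
import numpy as np

rng = np.random.default_rng(1)
majority = rng.normal(50, 5, 9_500)   # 95 percent of the dataset
minority = rng.normal(65, 5, 500)     # 5 percent subgroup with a different (valid) distribution

values = np.concatenate([majority, minority])
is_minority = np.concatenate([np.zeros(9_500, bool), np.ones(500, bool)])

# Standard cleaning step: drop anything more than two standard deviations
# from the overall mean.
z = (values - values.mean()) / values.std()
kept = np.abs(z) < 2

print("share of majority rows dropped:", round(1 - kept[~is_minority].mean(), 3))
print("share of minority rows dropped:", round(1 - kept[is_minority].mean(), 3))
# The subgroup loses a far larger share of its rows, so the "cleaner" model
# ends up seeing an even less representative picture of it.
```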

Where data comes from matters, because the source unquestionably affects the outcomes and interpretation of healthcare models. In sub-Saharan Africa, young women are diagnosed with breast cancer at a significantly higher rate than in the Global North. This points to the need for AI tools and healthcare models tailored to that demographic group, rather than breast cancer detection tools trained only on mammograms from the Global North. Likewise, a growing body of work suggests that algorithms used to detect skin cancer tend to be less accurate for Black patients because they are trained mostly on images of light-skinned patients. The list goes on.

We are creating tools and systems that have the potential to revolutionize the healthcare sector, but the benefits of these developments will only reach those represented in the data.

So, what can be done?

Part of the challenge in getting bias out of data is that high-volume, diverse, and representative datasets are not easy to access. Training datasets that are publicly available tend to be extremely narrow, low-volume, and homogeneous; they capture only a partial picture of society. At the same time, a wealth of diverse health data is captured every day in many healthcare settings, but data privacy laws make accessing these more voluminous and diverse datasets difficult.

Data protection is of course vital. Big Tech and governments do not have the best track record when it comes to the responsible use of data. However, if transparency, education, and consent for the sharing of medical data were more purposefully regulated, far more diverse and voluminous datasets could contribute to fairer representation across AI systems and yield better, more accurate results from AI-driven healthcare tools.

But data sharing and access is not a complete fix to healthcare's AI problem. Better and personalized healthcare through AI is still a hugely challenging problem that will take an army of scientists and engineers. At the end of the day, we want to teach our algorithms to make good choices but we are still figuring out what good choices should look like for ourselves.

AI presents the opportunity to bring greater personalization to healthcare, but it equally presents the risk of entrenching existing inequalities. We have the opportunity in front of us to take a considered approach to data collection, regulation, and use that will provide a fuller and fairer picture and enable the next steps for AI in healthcare.

------

Angela Wilkins is the executive director of the Ken Kennedy Institute at Rice University.

In a guest column, these lawyers explain the pros and cons of using AI for hiring. Photo via Getty Images

Here's what Houston employers need to know about using artificial intelligence in the hiring process

guest column

Workplace automation has entered the human resources department. Companies rely increasingly on artificial intelligence to source, interview, and hire job applicants. These AI tools are marketed as ways to save time, improve the quality of a workforce, and eliminate unlawful hiring biases. But is AI really incapable of hiring discrimination? Can a company escape liability for discriminatory hiring because "the computer did it"?

Ultimately, whether AI is a solution or a landmine depends on how carefully companies implement the technology. AI is not immune from discrimination and federal law holds companies accountable for their hiring decisions, even if those decisions were made in a black server cabinet. The technology can mitigate bias, but only if used properly and monitored closely.

Available AI tools

The landscape of AI technology is continually growing and covers all portions of the hiring process — recruiting, interviewing, selection, and onboarding. Some companies use automated candidate sourcing technology to search social media profiles to determine which job postings should be advertised to particular candidates. Others use complex algorithms to determine which candidates' resumes best match the requirements of open positions. And some employers use video interview software to analyze facial expressions, body language, and tone to assess whether a candidate exhibits preferred traits.

Federal anti-discrimination law

Although AI tools likely have no intent to unlawfully discriminate, that does not absolve the employers who use them from liability. This is because the law contemplates both intentional discrimination (disparate treatment) and unintentional discrimination (disparate impact). The larger risk for AI lies with disparate impact claims. In such lawsuits, intent is irrelevant. The question is whether a facially neutral policy or practice (e.g., use of an AI tool) has a disparate impact on a particular protected group, such as one defined by race, color, national origin, gender, or religion.

The Equal Employment Opportunity Commission, the federal agency in charge of enforcing workplace anti-discrimination laws, has demonstrated an interest in AI and has indicated that such technology is not an excuse for discriminatory impacts.

Discrimination associated with AI tools

The diversity of AI tools means that each type of technology presents unique potential for discrimination. One common thread, however, is the potential for input data to create a discriminatory impact. Many algorithms rely on a set of inputs to understand search parameters. For example, a resume screening tool is often set up by uploading sample resumes of high-performing employees. If those resumes favor a particular race or gender, and the tool is instructed to find comparable resumes, then the technology will likely reinforce the existing homogeneity.
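
A toy sketch, with entirely made-up features, of how that reinforcement happens: a screener that scores candidates by similarity to uploaded "high performer" resumes rewards whatever those samples have in common, including attributes that have nothing to do with job performance.

```python
import numpy as np

# Hypothetical feature vector: [years_experience, has_cs_degree, played_varsity_football]
# The third feature is irrelevant to the job but correlates with gender.
high_performer_samples = np.array([
    [6, 1, 1],
    [8, 1, 1],
    [7, 1, 1],
])
profile = high_performer_samples.mean(axis=0)   # what "good" looks like to the tool

candidates = {
    "candidate_a": np.array([7, 1, 1]),   # fits the incumbent mold
    "candidate_b": np.array([7, 1, 0]),   # identical qualifications, no football
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

for name, vector in candidates.items():
    print(name, "similarity score:", round(cosine(vector, profile), 4))
# candidate_a outscores candidate_b on an attribute unrelated to performance,
# so the tool quietly reproduces the homogeneity of the sample resumes.
```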

Some examples are less obvious. Sample resumes may include employees from certain zip codes that are home to predominately one race or color. An AI tool may favor those zip codes, disfavoring applicants from other zip codes of different racial composition. Older candidates may be disfavored by an algorithm's preference for ".edu" email addresses. In short, if a workforce is largely composed of one race or one gender, having the tool rely on past hiring decisions could negatively impact applicants of another race or gender.

Steps to mitigate risk

There are a handful of steps that employers can take to use these technologies and remain compliant with anti-discrimination laws.

First, companies should demand that AI vendors disclose as much as possible about how their products work. Vendors may be reluctant to share details about proprietary technology, but employers will ultimately be responsible for any discriminatory impact. Thus, as part of contract negotiations, a company should consider seeking indemnification from the vendor for discrimination claims.

Second, companies should consider auditing the tool to ensure it does not yield a disparate impact on protected individuals. Along the same lines, companies should be careful in selecting input data. If the inputs reflect a diverse workforce, a properly functioning algorithm should, in theory, replicate that diversity.
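
One common screen for such an audit is the EEOC's "four-fifths" rule of thumb, which compares each group's selection rate to that of the most-selected group. Below is a minimal sketch with made-up numbers; it is an illustrative heuristic check, not a legal safe harbor or a substitute for counsel.

```python
# Hypothetical applicant and selection counts produced by an AI screening tool.
applicants = {"group_a": 200, "group_b": 150}
selected   = {"group_a": 60,  "group_b": 27}

rates = {group: selected[group] / applicants[group] for group in applicants}
highest_rate = max(rates.values())

for group, rate in rates.items():
    impact_ratio = rate / highest_rate
    flag = "review for disparate impact" if impact_ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, impact ratio {impact_ratio:.2f} -> {flag}")
# group_b is selected at only 60 percent of group_a's rate, well under the
# four-fifths threshold, so the tool's results warrant a closer look.
```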

Third, employers should stay abreast of developments in the law. This is an emerging field, and state legislators have taken notice. Illinois recently passed legislation governing the use of AI in the workplace, and other states, including New York, have introduced similar bills.

AI can solve many hiring challenges and help cultivate a more diverse and qualified workforce. But the tools are often only as unbiased as the creators and users of that technology. Careful implementation will ensure AI becomes a discrimination solution — not a landmine.

------

Kevin White is a partner and Dan Butler is an associate with Hunton Andrews Kurth LLP, which has an office in Houston.

Artificial intelligence is changing Houston — one industry at a time. Photo via Getty Images

3 ways artificial intelligence is changing Houston's future

Guest column

Artificial intelligence is the buzzword of the decade. From grocery shopping assistance to personal therapy apps, AI has sunk its teeth into every single industry. Houston is no exception to the AI boom. Enterprise-level companies and startups are already flocking to H-town to make their mark in AI and machine learning.

Since the world is generating more data every minute — 1,736 terabytes to be exact — Houston-based companies are already thinking ahead about how to make sense of all of that information in real-time. That's where AI comes in. By 2021, 80 percent of emerging technologies will have AI foundations — Houston is already ninth on the list of AI-ready cities in the world.

AI and machine learning can process large amounts of data quickly and use that data to inform decisions much like a human would. Here are three ways Houston-based companies are using these emerging technologies to revolutionize the city's future.

Health care

The health care industry is primed for AI's personalization capabilities. Each patient that doctors and nurses encounter has different symptoms, a different health background, and different prescriptions to keep track of. Managing that amount of information incorrectly can be dangerous. With AI, diseases can be diagnosed more quickly, medications can be administered more accurately, and nurses get help monitoring patients.

Decisio Health Inc., a Houston-based health tech startup, has already made its mark on the healthcare industry with AI software that is helping to tackle the COVID-19 pandemic. Its software, developed in collaboration with GE Healthcare Inc., allows health care providers to remotely monitor patients. By looking at data from ventilators, patient monitoring systems, health records, and other sources, doctors can make better decisions about patients from a safe distance.

Climate change

Climate change won't be solved overnight. It's an issue that spans water salinity, deforestation, and even declining bee populations. With a problem as large as climate change, huge amounts of data are collected and need to be analyzed. AI can interpret all of that information, model possible future outcomes, track current weather patterns, and help identify solutions to environmental destruction.

One Houston-based company in the energy tech industry, Enovate Upstream, has created a new AI platform intended to help digitize the oil and gas sector. The platform analyzes data from digital drilling, digital completions, and digital production to give oil companies real-time production forecasting. The company hopes this will make oil production more efficient and reduce carbon emissions. Since oil drilling and fracking are a major source of climate concern, that efficiency could help slow climate change and make the industry as a whole more climate-conscious.

Energy

Energy is an industry rich with data opportunities, and as Houston's energy sector grows, AI has become a core part of its work. Houston's large influence in the energy sector has primed it for AI integration from startups like Adapt2 Solutions Inc. By using AI and machine learning in its software, the company hopes to help energy companies make strategic predictions about how to serve energy to the public efficiently. That work has become especially important in the wake of COVID-19 and the resulting shifts in energy needs.

Another Houston-based company using AI to influence the energy industry is the retail energy startup Evolve Energy. Its AI and machine learning system helps customers find better prices on fluctuating renewable resources, saving them money on electricity and reducing emissions. Positive public feedback on the model shows how energy companies can use emerging technologies like AI to benefit their communities.

The bottom line

Houston is more primed than most cities to integrate AI and machine learning into every industry. While there are valid concerns as to how much we should lean on technology for necessary daily tasks, it's clear that AI isn't going anywhere. And it's clear that Houston is currently taking the right steps to continue its lead in this emerging AI market.

------

Natasha Ramirez is a Utah-based tech writer.

James Yockey is a co-founder of Landdox, which recently integrated with ThoughtTrace. Courtesy of Landdox

These two Houston software companies are making contracts less cumbersome for oil and gas companies

Team work

The biggest asset of most oil and gas companies is their leasehold: the contracts or deeds that give the company the right to either drill wells and produce oil and gas on someone else's land, or give them title to that land outright. A typical oil and gas company is involved in thousands of these uniquely negotiated leases, and the software to keep these documents organized hasn't been updated in more than a decade, says James Yockey, founder of Houston-based Landdox.

Landdox does just that: it provides an organizational framework for companies' contracts and leaseholds. The company recently entered into an integration with Houston-based ThoughtTrace, an artificial intelligence company whose software can scan cumbersome, complicated contracts and leaseholds and pull out key words and provisions.

With this integration, companies can use ThoughtTrace to easily identify key provisions of their contracts, and then sync up those provisions with their Landdox account. From there, Landdox will organize those provisions into easy-to-use tools like calendars, reminders and more.

The framework behind the integration
The concept behind Landdox isn't entirely new — there are other software platforms built to organize oil and gas companies' assets — but it's the first company in this space that's completely cloud-based, Yockey says.

"Within these oil and gas leases and other contracts are really sticky provisions … if you don't understand them, and you're not managing them, it can cause you to forfeit a huge part of your asset base," Yockey says. "It can be a seven-, eight-, or nine-digit loss."

These contracts and leases can be as long as 70 or 80 pages, Yockey says, and have tricky provisions buried in them. Before the integration with ThoughtTrace, oil and gas companies would have to manually pore over these contracts and identify key provisions that could then be sent over to Landdox, which would organize the data and documents in an easy-to-use platform. The ThoughtTrace integration removes a time-consuming step of the process for oil and gas companies.

"[ThoughtTrace] identifies the most needle moving provisions and obligations and terms that get embedded in these contracts by mineral owners," Yockey says. "It's a real source of leverage for the oil and gas companies. You can feed ThoughtTrace the PDF of the lease and their software will show you were these provisions are buried."

The origin story
Landdox was founded in 2015 and is backed by a small group of angel investors. Yockey says the investors provided a "little backing" and adds that Landdox is a "very capital-efficient" software company.

Landdox and ThoughtTrace connected in 2017, when the companies were working with a large, private oil and gas company in Austin. The Austin-based oil and gas company opted to use Landdox and ThoughtTrace in parallel, which inspired the two companies to develop an integrated prototype.

"We built a prototype, but it was clear that there was a bigger opportunity to make this even easier," Yockey says. "To quote the CEO of ThoughtTrace, he called [the integration] an 'easy button.'"

The future of ERP software
Landdox's average customer is a private equity-backed E&P or mineral fund, Yockey says, though the company also works with closely held, family-owned companies. Recently, however, Landdox has been adding a new kind of company to its client base.

"What's interesting is we're starting to add a new customer persona," Yockey says. "The bigger companies – the publicly traded oil and gas companies –have all kinds of different ERP (Enterprise Resource Planning) software running their business, but leave a lot to be desired in terms of what their team really needs."

Yockey says that at a recent North American Prospect Expo summit, half a dozen large-capitalization oil and gas producers invited Landdox to their offices to discuss potentially supplementing their ERP software.

"Instead of trying to be all things to all people, we stay in our lane, but find cool ways to connect with other software (companies)," Yockey says.

Over half of Houston business leaders say their company has already enabled AI, blockchain, and extended reality technology. Getty Images

Business leaders in Houston have a surprisingly high tech adoption rate

Early bird gets the worm

When it comes to enabling new technologies to advance business practices, Houston business leaders are ahead of the curve. According to a new study, the majority of the companies surveyed are already using artificial intelligence, blockchain, and extended reality today.

The global study, Technology Vision 2019, was conducted by Accenture and included surveys from 6,600 business and IT executives around the world, including 100 in Houston. Dallas was the only other Texas market surveyed, along with nine other major United States metros — Atlanta, Boston, Chicago, Detroit, Minneapolis, New York City, San Francisco, Seattle, and Washington D.C.

Of the 100 respondents, 91 said that innovation efforts have accelerated within their organization over the past three years because of new technology, and 80 said that while they feel their employees are digitally savvy, those employees are "waiting" for the company's technology to catch up. Meanwhile, 47 percent say the need to reskill employees because of emerging tech in the workplace will arise within the next two years.

The survey also focused on three distinct technologies — AI, blockchain, and extended reality (XR), which includes augmented reality, virtual reality, and mixed reality. The XR responses indicate that 66 percent of business leaders have already engaged with some form of the technology, either using it in one or more business units (37 percent) or piloting it (29 percent).

The adoption numbers for AI are similar, with 65 percent of leaders saying they have already introduced AI in the workplace — nearly 2 in 5 have adopted it somewhere within the company, while over 1 in 4 say their company has an AI pilot program.

Blockchain, according to the study, falls further down the spectrum at Houston companies. Fifteen percent of the companies have a pilot program and 42 percent have blockchain technology already in use in one or more business units, for a total adoption rate of 57 percent.

With 5G on the horizon, 79 percent of respondents say the technology is going to revolutionize how their industry provides products or services to clients. Almost half say that impact will arrive, and jobs will be altered, within the next three years.

Brian Richards, managing director at Accenture, oversees the company's Houston Innovation Hub. The hub welcomes business leaders who use Accenture's services to ideate and then implement innovative technologies. At a recent panel in the Accenture office, Richards spoke about emerging tech in Houston and said there's been no shortage of leaders wanting to move the needle on new tech.

"I've never seen [corporations] more motivated than they are right now to be able to think differently on how they are able to engage Houston," he said.

Comcast donates tech, funds to support diversity-focused nonprofit

gift of tech

A Houston organization focused on helping low-income communities by providing access to education, training, and employment has received a new donation.

Comcast’s Internet Essentials program announced a donation of a $30,000 grant and 1,000 laptops to SERJobs. The gift is part of a new partnership with SERJobs aimed at educating and equipping adults with technical skills, including training on Microsoft Office and professional development.

“SERJobs is excited to celebrate 10 years of Comcast's Internet Essentials program,” says Sheroo Mukhtiar, CEO of SERJobs, in a news release. “The Workforce Development Rally highlights the importance of digital literacy in our increasingly virtual world—especially as technology and the needs of our economy evolve. We are grateful to Comcast for their ongoing partnership and support of SERJobs and our members.”

For 10 years, Comcast's Internet Essentials program has connected more than 10 million people to the Internet at home, most of them for the first time. This particular donation is part of Project UP, Comcast’s comprehensive initiative to advance digital equity.

“Ten years is a remarkable milestone, signifying an extraordinary amount of work and collaboration with our incredible community partners across Houston,” says Toni Beck, vice president of external affairs at Comcast Houston, in the release.

“Together, we have connected hundreds of thousands of people to the power of the Internet at home, and to the endless opportunity, education, growth, and discovery it provides," she continues. "Our work is not done, and we are excited to partner with SERJobs to ensure the next generation of leaders in Houston are equipped with the technical training they need to succeed in an increasingly digital world.”

It's not the first time the tech company has supported Houston's low-income families. This summer, Comcast's Internet Essentials program and Region 4 Education Service Center partnered with the Texas Education Agency's Connect Texas Program to make sure Texas students have access to internet services.

Additionally, Comcast set up an internet voucher program with the City of Houston last December, and earlier this year the company announced that 50 Houston-area community centers will have free Wi-Fi connections for three years. The company also dedicated $1 million this year to small businesses owned by Black, Indigenous, and people of color that are struggling due to the pandemic.

President Joe Biden appoints Houston green space guru to lofty national post

new gig

A prominent and nationally acclaimed Houston parks figure has just received a hefty national appointment. President Joe Biden has named Beth White, Houston Parks Board president and CEO, the chair of the National Capital Planning Commission (NCPC), the organization announced.

The NCPC, established by Congress in 1924, is the federal government’s central planning agency for the National Capital Region. The commission provides overall guidance related to federal land and buildings in the region. Functions include reviewing the design of federal and local projects, overseeing long-range planning for future development, and monitoring capital investment by federal agencies.

Fittingly, White was initially appointed to NCPC as the at-large presidential commissioner in January 2012, per a press release. She was reappointed for another six-year term in 2016. Most recently, White served as the commission’s vice-chair.

“I’m honored to chair the National Capital Planning Commission and work with my fellow commissioners to build and sustain a livable, resilient capital region and advance the Biden Administration’s critical priorities around sustainability, equity, and innovation,” White said in a statement.

Before joining Houston Parks Board in 2016, White served as the director of the Chicago Region Office of The Trust for Public Land, where she spearheaded development of The 606 public park and was instrumental in establishing Hackmatack Wildlife Refuge.

Renowned in the Windy City, she also was managing director of communications and policy for the Chicago Housing Authority; chief of staff for the Chicago Transit Authority’s Chicago Transit Board; and assistant commissioner for the City of Chicago’s Department of Planning and Development. She was the founding executive director of Friends of the Chicago River, and currently serves on the Advisory Board for Urban Land Institute Houston.

The graduate of Northwestern and Loyola universities most recently received the Houston Business Journal’s 2021 Most Admired CEO award, per her bio.

------

This article originally ran on CultureMap.