With $1.5M in seed funding, Humanitas aims to help enterprises handle algorithmic bias

Humanitas Technologies, an artificial intelligence startup building a data pool designed to fight algorithmic bias for diversity and inclusion, today announced that it has raised $1.5 million in seed funding.

Investors included Francoise Brougher, Jeff Walker and Mark Tercek, as well as founders and executives from big-name companies including Google AI, Expedia and Meta.

The vision of Humanitas is to help businesses avoid the pitfalls of bias in the demographic data they use to drive their marketing algorithms. Bias can emerge in algorithms because the data used to train them is often limited and doesn’t always represent a diverse cross-section of the communities the business is trying to reach with its marketing or programs.

Speaking with SiliconANGLE, Humanitas co-founder and Chief Executive Phil Chow explained that another problem is that big businesses simply see diversity, equity and inclusion as difficult and costly issues to implement. That leads them to continue using biased algorithms, which can leave out women and minority communities.

“Companies view this as an expense and as long as they view it as an expense, it’ll only be a ‘nice to have,’” Chow said.

To overcome this situation, Humanitas provides easy-to-use application programming interfaces that allow developers to tap into its insights and data with ease. That strips away the friction companies might face when engineering inclusion into their marketing, community development and algorithm design, meaning that it’s no longer just an expense.

Humanitas gets its data from a large number of sources, including millions of public and partnered nonprofit sources. It then aggregates and de-identifies that data so it can’t be used to identify the individual people it was sourced from. The resulting data can then be used to provide less biased information about the actual socioeconomic and demographic makeup of a locality for marketing, finance or other algorithmic decisions.
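Humanitas hasn’t published implementation details, but the aggregate-and-de-identify pattern the article describes is a well-known one. The following is a minimal sketch of the general technique, not the company’s actual pipeline: direct identifiers are dropped, records are rolled up into per-locality group counts, and any group too small to hide an individual is suppressed (a k-anonymity-style threshold). All names and the threshold value here are assumptions for illustration.

```python
from collections import defaultdict

# Hypothetical illustration of aggregate-and-de-identify; the field
# names and threshold are assumptions, not Humanitas' actual design.
K_ANONYMITY_THRESHOLD = 5  # suppress groups smaller than this


def aggregate_by_locality(records):
    """Roll individual records up into per-(locality, group) counts,
    discarding direct identifiers such as names entirely."""
    counts = defaultdict(int)
    for record in records:
        key = (record["locality"], record["demographic_group"])
        counts[key] += 1
    return dict(counts)


def de_identify(counts, k=K_ANONYMITY_THRESHOLD):
    """Release only groups with at least k members, so published
    statistics cannot single out an individual."""
    return {key: n for key, n in counts.items() if n >= k}


# Toy input: six people in one group, one person alone in another.
records = [
    {"name": "A", "locality": "Oakland", "demographic_group": "18-24"},
    {"name": "B", "locality": "Oakland", "demographic_group": "18-24"},
    {"name": "C", "locality": "Oakland", "demographic_group": "65+"},
] + [
    {"name": str(i), "locality": "Oakland", "demographic_group": "18-24"}
    for i in range(4)
]

released = de_identify(aggregate_by_locality(records))
# The 18-24 group (6 members) is released; the 65+ group
# (1 member) is suppressed rather than exposed.
```

The design choice here is the usual privacy/utility trade-off: a higher threshold protects small populations better but discards more locality-level signal.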

“People of color will make up a majority of the American working class by 2032,” Chow noted. “Algorithms powering enterprises today are not designed to account for this seismic demographic shift.”

With the pandemic, Chow added, many companies have been shifting focus even more heavily toward local and personalized marketing. However, the algorithms they use lack a robust or nuanced understanding of the people who live in those localities, because they “see” only a segmented view of behavior based on shopping habits, with no insight into values outside the business relationship.

With this financing, Humanitas intends to expand its engineering capabilities to build on its existing data aggregation capabilities, enhance its API and make it easier for enterprises to implement its tools.

Chow said his vision is to see enterprise and retail businesses become less biased, racist and sexist in their use of algorithmic marketing and other platforms.

“These algorithms that we’re using are mainly biased for people like you and me,” Chow said, referring to white and East Asian-identified males. “The graphs that we built upon for social media and e-commerce are you and me, but there are going to be a lot more people not you and me in the future and it’s not calibrated for that. If I look into the current Wikipedia data or demographic data, it can’t account for that. It’s not about the number of rows of data, it’s the number of columns and how we can better complement the unique understanding of society.”
