U.S. programmers further developed their AI-enabled housing solution, an application to help automate Dallas-Fort Worth’s Section 8 voucher program. The app uses artificial intelligence (AI) and automation to help voucher holders find rental units, property owners complete contracting, and housing authorities conduct inspections. The software and mobile app were released in partnership with the Dallas Housing Authority, which gave access to data from some 16,000 Section 8 voucher holders.
AI underpins a host of algorithms in medicine, banking and other major industries. But as it has proliferated, studies have shown that AI can be biased against minorities, and in housing it has helped perpetuate segregation and discrimination. Worried that the app would promote bias, its creators designed the search so that tenants could look for apartments using their voucher number alone, without providing any other identifying information.
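The design choice described above can be sketched in a few lines: the search request carries only an opaque voucher number plus unit criteria, so no name or other personal attribute is available for the matching logic to condition on. This is a hypothetical illustration only; the app's actual data model and function names (`Unit`, `find_units`) are invented here.

```python
# Minimal sketch of a voucher-number-only apartment search.
# All identifiers and listing data are invented for illustration;
# they do not reflect the actual app's schema.
from dataclasses import dataclass

@dataclass
class Unit:
    unit_id: str
    bedrooms: int
    rent: int

# Toy listings standing in for a housing authority's database.
LISTINGS = [
    Unit("A-101", 2, 1400),
    Unit("B-207", 3, 1750),
    Unit("C-310", 2, 1300),
]

def find_units(voucher_number: str, bedrooms: int, max_rent: int) -> list[Unit]:
    """Match listings using only the voucher number and unit criteria.

    No name, race, gender, or address history enters the query, so the
    matching cannot condition on those attributes.
    """
    assert voucher_number.isalnum()  # treated as an opaque token, not personal data
    return [u for u in LISTINGS
            if u.bedrooms >= bedrooms and u.rent <= max_rent]

matches = find_units("V12345", bedrooms=2, max_rent=1500)
print([u.unit_id for u in matches])  # -> ['A-101', 'C-310']
```

The point of the sketch is what the function signature excludes: since identifying fields never reach the matching step, the system has nothing to discriminate on at that stage.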
As AI is adopted by more industries and government agencies, U.S. lawmakers want to strengthen and update laws to guard against racially discriminatory algorithms – especially in the absence of federal rules. Since 2019, more than 100 bills related to AI and automated decision systems have been introduced in nearly two dozen states, according to the National Conference of State Legislatures. This year, lawmakers in at least 16 states proposed creating panels to review AI’s impact, promote public and private investment in AI, or address transparency and fairness in AI development.
A bill in California would be the first to require developers to evaluate the privacy and security risks of their software, as well as assess their products’ potential to generate inaccurate, unfair, biased or discriminatory decisions. Under the proposed law, the California Department of Technology would have to approve software before it could be used in the public sector.
A lawyer described algorithms such as the AI app as a gatekeeper to opportunity, one that can either perpetuate segregation and redlining or help to end them. He praised the developers for their decision to omit a person’s name, but cautioned that the government cannot rely on small groups of people making decisions that can affect thousands. The government, he said, needs to audit these systems to ensure they integrate equity metrics in ways that do not unfairly disadvantage people.
The app’s developers are confident it would pass any state-mandated test for algorithmic discrimination, and it has already seen significant uptake in Dallas and beyond. The Dallas Housing Authority has used the app to cut the average wait time for an apartment inspection from 15 days to one. Since its launch, Dallas and more than a dozen other housing agencies have added some 20,000 Section 8 units from landlords who had not been participating in the program because of the long inspection wait times.
The Dallas Housing Authority partnered with the developers to advance its workflows and automation so that it could respond to business partners in a more timely manner. The housing authority wanted to ensure that its partners did not see the delay as a lost lead in terms of working with the voucher program.
The real promise of AI in the housing space is that it may eventually produce greater fairness and equity in ways that were not possible before. Lawmakers are keen to make sure that the biases of the analogue world are not repeated in the AI and machine-learning world.
U.S. researchers have been creating AI for a multitude of purposes, including systems that can hold free-flowing conversations. As reported by OpenGov Asia, the newest conversational artificial intelligence (AI) model, called Language Model for Dialogue Applications (LaMDA), aims to replace stilted, robotic exchanges with more natural dialogue. LaMDA can engage in free-flowing conversation on a seemingly endless number of topics, an ability that could unlock more natural ways of interacting with technology and entirely new categories of helpful applications.
The researchers are developing several qualities in LaMDA, including sensibleness, specificity and “interestingness”, assessing whether responses are insightful, unexpected or witty. They also want LaMDA to stick to facts and are investigating ways to ensure its responses are not just compelling but correct.
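To make the three qualities concrete, the toy scorer below rates a candidate response along the dimensions named above. The heuristics are entirely invented for illustration: LaMDA's actual metrics are learned from human ratings, not keyword rules like these.

```python
# Toy illustration of scoring a dialogue response for sensibleness,
# specificity and "interestingness". The rules here are invented
# stand-ins for the learned metrics a real system would use.
def score_response(response: str) -> dict:
    words = response.split()
    generic = {"yes", "no", "ok", "nice", "cool"}
    cleaned = [w.lower().strip(".,!?") for w in words]
    return {
        # sensible: a reply longer than a single filler word
        "sensibleness": float(len(words) > 1),
        # specific: penalize replies made up of generic filler
        "specificity": 1.0 - sum(w in generic for w in cleaned) / max(len(words), 1),
        # "interesting": crude proxy -- vocabulary variety
        "interestingness": len(set(cleaned)) / max(len(words), 1),
    }

print(score_response("OK."))
print(score_response("The encore reworked the opening theme in 7/8 time."))
```

A one-word “OK.” scores zero on sensibleness and specificity under these toy rules, while the concert reply scores highly on all three, which is the trade-off the researchers describe: responses that are not merely valid, but specific and engaging.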