Emerging technologies such as facial recognition systems and pedestrian detection for self-driving cars have repeatedly been found to identify lighter skin tones more accurately than darker ones. Disparities such as this have been shown to negatively affect people of color and have become a topic of concern for A.I. and robotics researchers in the United States. Experts attribute the problem to the models these technologies are trained on, which are largely built by white, male engineers. Several groups, including Black in Computing, Black in Robotics, and the Algorithmic Justice League, are calling on the tech community to recognize that inherent biases are being replicated in developing technologies like A.I. and robotics. One solution, proposed by Professor Chris Crawford of the University of Alabama, is “having more folks that look like the United States population at the table when technology is designed.” One topic of disagreement among these groups, however, relates to developing A.I. and robotics technologies for government authorities. Some argue that robotics research should be kept away from law enforcement because police cannot be trusted to use the resulting technologies responsibly. Advocates of this position point to the history of brutality and racism toward people of color by law enforcement in the United States and argue that A.I. and robotics technologies will only make it easier for police to continue such behavior. Others counter that a zero-tolerance stance is too blunt for a problem that demands a more nuanced solution, and that robots can “make police work safer for officers and civilians.”
Apps have become ubiquitous solutions to many ‘problems,’ whether real or imagined. This has, in turn, created an environment where companies, including law firms, may feel pressured to create an app even when they don’t know who the app’s target audience is or how much it will cost to maintain over time. Eric Goldman, professor at Santa Clara University School of Law and co-director of its High Tech Law Institute, says these questions are crucial for law firms to answer when considering developing an app. Gabriel Cheong, a Massachusetts attorney who helped develop an app to calculate complex child support payments, suggests firms consider whether an app builds the firm’s long-term brand or will convert prospective clients into actual clients: “If your app doesn’t add to a functionality that already exists on a smartphone, then it’s useless.” These perspectives suggest that firms would benefit from analyzing data and thinking through an app’s purpose before developing one.
Paper use has been on the decline for decades, but has not yet disappeared. While digital products like Document and Case Management Systems (DMS and CMS, respectively) enable law firms and court systems to widely access shared documents, paper has remained in the industry for a number of reasons. Sometimes its use can be attributed to inertia and familiarity; other times, it is because some prefer physical paper as a less distracting medium for reading. The pandemic, however, has forced firms to leverage their full suite of digital tools, and the proof is in the numbers: “[a] Magic Circle law firm recently mentioned in conversation that they had 93% fewer requests to retrieve paper documents from their storage locations during the lockdown period compared to pre-lockdown.”
A.I. technologies such as machine learning require vast amounts of data to produce results. Data is constantly generated, but capturing and storing it in a form usable for research can be prohibitively complex and expensive for some who wish to make use of it, especially in EU jurisdictions where the GDPR further complicates many forms of data collection. While the GDPR can be applauded for its attempt to give citizens control over their data, it can also work against researchers who rely on data to make advances for the public good, such as “better disease diagnostics” and improved public services. The Data Governance Act, proposed by the European Commission on November 25, 2020, is meant to ease the hard restrictions and barriers that currently limit EU firms from taking fuller advantage of data repositories. A press release from the European Commission announcing the proposal states, “[t]he Regulation will facilitate data sharing across the EU and between sectors to create wealth for society, increase control and trust of both citizens and companies regarding their data, and offer an alternative European model to data handling practice[s] of major tech platforms.”
In November 2018, several EU consumer organizations filed a privacy complaint against Google, arguing the tech firm’s use of location data violated the GDPR. Two years later the complaint remains unresolved, and Google continues to profit from its use of EU citizens’ location data. Part of the delay comes from the fact that “Ireland’s data regulator has to deal with a disproportionate number of multinational tech companies, given how many have established their EU base in the country.” Delays of this kind enable big tech firms “to (superficially) tweak [their] practices” and to continue skirting regulation while they cover their tracks with “misleading PR campaigns.” Each EU member state administers its own Data Protection Authority (DPA), which can further complicate the process of filing a complaint against big tech firms because the litigation will often involve multiple jurisdictions. Perhaps it would be better to have a centralized DPA for firms of a certain size? Whatever the solution, EU citizens will, for now, find little to celebrate in the GDPR’s many accomplishments in data regulation, as the pace of enforcement struggles to match the speed of big tech.
U.K. lawmakers recently announced a plan to set up a new Digital Markets Unit (DMU) to regulate competition among online platforms. The plan stems from complaints that users of platforms such as Facebook and Google are harmed by the limited choice they have when participating in markets dominated by a handful of tech giants. Those tech giants have gained incredible market share by offering their services for ‘free’ and then crafting policies that entrench users in their web (no pun intended) and ward off third-party businesses to which users are effectively blind. “The new code will set clear expectations for platforms that have considerable market power — known as strategic market status — over what represents acceptable behaviour when interacting with competitors and users,” the Department for Digital, Culture, Media and Sport wrote in a press release. Commenting on the announcement of the DMU in a statement, digital secretary Oliver Dowden said: “I’m unashamedly pro-tech and the services of digital platforms are positively transforming the economy — bringing huge benefits to businesses, consumers and society. But there is growing consensus in the U.K. and abroad that the concentration of power among a small number of tech companies is curtailing growth of the sector, reducing innovation and having negative impacts on the people and businesses that rely on them. It’s time to address that and unleash a new age of tech growth.”