{"id":23339,"date":"2022-11-16T17:14:15","date_gmt":"2022-11-16T20:14:15","guid":{"rendered":"https:\/\/ee02395c61.nxcli.io\/insights\/o-que-e-racismo-algoritmico-e-como-supera-lo\/"},"modified":"2024-03-26T16:35:06","modified_gmt":"2024-03-26T19:35:06","slug":"how-to-overcome-algorithmic-racism","status":"publish","type":"insights","link":"https:\/\/elogroup.com\/en\/insights\/how-to-overcome-algorithmic-racism\/","title":{"rendered":"What is algorithmic racism and how to overcome it?\u00a0"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"23339\" class=\"elementor elementor-23339 elementor-15633\" data-elementor-post-type=\"insights\">\n\t\t\t\t\t\t<section data-particle_enable=\"false\" data-particle-mobile-disabled=\"false\" class=\"elementor-section elementor-top-section elementor-element elementor-element-685d7e97 elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"685d7e97\" data-element_type=\"section\" data-e-type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-4786d4b4\" data-id=\"4786d4b4\" data-element_type=\"column\" data-e-type=\"column\" data-settings=\"{&quot;background_background&quot;:&quot;classic&quot;}\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-2fb5b1d0 elementor-widget elementor-widget-text-editor\" data-id=\"2fb5b1d0\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><strong>By EloInsights<\/strong><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-aff0d8a elementor-widget-divider--view-line elementor-widget elementor-widget-divider\" 
data-id=\"aff0d8a\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"divider.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-divider\">\n\t\t\t<span class=\"elementor-divider-separator\">\n\t\t\t\t\t\t<\/span>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-477b2baa elementor-widget elementor-widget-text-editor\" data-id=\"477b2baa\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<ul><li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"1\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;\uf0b7&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" aria-setsize=\"-1\" data-aria-posinset=\"1\" data-aria-level=\"1\"><i><span data-contrast=\"auto\">Technology is constituted and makes sense through people. 
For this reason, it is also capable of carrying biases and reproducing discrimination that affects everyone, including companies.<\/span><\/i><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/li><li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"1\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;\uf0b7&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" aria-setsize=\"-1\" data-aria-posinset=\"2\" data-aria-level=\"1\"><i><span data-contrast=\"auto\">Emerging inventions, such as artificial intelligence, already support and make various kinds of decisions. The danger lies in being guided by a false neutrality of technology, ignoring its social dimension.<\/span><\/i><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/li><li data-leveltext=\"\uf0b7\" data-font=\"Symbol\" data-listid=\"1\" data-list-defn-props=\"{&quot;335552541&quot;:1,&quot;335559685&quot;:720,&quot;335559991&quot;:360,&quot;469769226&quot;:&quot;Symbol&quot;,&quot;469769242&quot;:[8226],&quot;469777803&quot;:&quot;left&quot;,&quot;469777804&quot;:&quot;\uf0b7&quot;,&quot;469777815&quot;:&quot;hybridMultilevel&quot;}\" aria-setsize=\"-1\" data-aria-posinset=\"3\" data-aria-level=\"1\"><i><span data-contrast=\"auto\">Through practical examples, we explain how algorithmic racism affects people. 
Fighting back against it involves building a corporate culture that respects and promotes diversity.<\/span><\/i><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/li><\/ul>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-cd51481 elementor-widget-divider--view-line elementor-widget elementor-widget-divider\" data-id=\"cd51481\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"divider.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-divider\">\n\t\t\t<span class=\"elementor-divider-separator\">\n\t\t\t\t\t\t<\/span>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-53b4dec elementor-widget elementor-widget-spacer\" data-id=\"53b4dec\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"spacer.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-spacer\">\n\t\t\t<div class=\"elementor-spacer-inner\"><\/div>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-1fcd1f00 elementor-widget elementor-widget-text-editor\" data-id=\"1fcd1f00\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span data-contrast=\"auto\">The use of facial recognition software may seem trivial to a white person. The technology is already embedded in various devices, like our own smartphones, and used in everyday activities, such as validating access to airports, schools, companies and even in the electronic gates of residential buildings. 
However, for a black person, or one belonging to other minority groups, this experience can be an aggressive reminder of the biases contained in our social structure, whether it is because their faces are not even detected or because their features can be mistaken for those of other people.\u00a0<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">This problem is partly explained by the lack of diversity in the images that feed databases used to train devices based on artificial intelligence (AI). The low variety of phenotypes (skin color, hair textures, eye shape and color, etc.) means this type of technology has a much greater chance of confusing, for example, the faces of two black people than of mistakenly identifying two white people.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Situations like these are part of what experts call \u201calgorithmic racism\u201d. This phenomenon refers to the unfolding of our unconscious biases in algorithms that order the functioning of devices and end up reproducing oppressions in digital environments. 
The result is that various emerging technologies are subject to reproducing stereotypes and asymmetries embedded in the collective imagination throughout history.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">\u201cI define algorithmic racism as the way in which the current arrangement of technologies, social and technical imaginaries strengthen the racialized ordering of knowledge, resources, space and violence to the detriment of non-white groups\u201d, explains Tarcizio Silva, a researcher associated with the Mozilla Foundation and author of the book <\/span><i><span data-contrast=\"auto\">Racismo algor\u00edtmico: intelig\u00eancia artificial e discrimina\u00e7\u00e3o nas redes digitais<\/span><\/i><span data-contrast=\"auto\"> (<\/span><i><span data-contrast=\"auto\">A<\/span><\/i><span data-contrast=\"auto\">lgorithmic racism: Artificial Intelligence and Discrimination in Digital Networks, in a free translation to English)<\/span><span data-contrast=\"auto\">.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">For Silva, the main problem is not in the lines of code, but in reinforcing and favoring the reproduction of designs of power and oppression that are already in place in the world. The implementation of digital technologies is not exclusively technical in nature and, therefore, cannot be uncritical. 
It also has social implications, which must be addressed by companies.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Pedro Guilherme Ferreira, Analytics Director at EloGroup, points to the need to understand how critical this debate is, beyond the potential damage to companies\u2019 reputations: \u201cRacism is a criminal offence and needs to be combated. Racism in algorithms should be no different. Bias in the data is contained in this universe and, ultimately, can cause great damage\u201d.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">In this article, we will mainly discuss algorithmic racism, but any kind of bias or discrimination interferes with the relationship between people and technology, two central elements in the digital transformation process that organizations are going through, especially as the social dimension gains relevance within ESG agendas. By looking at the \u201cS\u201d in the acronym, we can make a direct link between the theme of this article and the construction of an open corporate culture, capable of engaging stakeholders around strategic goals, without neglecting accountability for possible negative effects on society.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">\u201cDifferent points of view are constructed because of different experiences based on our social markers. 
In places where there are different perspectives, people will look at the same situation through a different lens\u201d, says Gabriel Lev\u00ed, Diversity and Inclusion (D&amp;I) leader at EloGroup. \u201cThe sum of these various views complements each other and forms a vision that is much closer to reality. This is the immense value of diversity when we think about decision-making&#8221;.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5ebb358 elementor-widget elementor-widget-image\" data-id=\"5ebb358\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img fetchpriority=\"high\" decoding=\"async\" width=\"800\" height=\"534\" src=\"https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-DTS-Identity-Alex-Tan-3669-1024x683.jpg\" class=\"attachment-large size-large wp-image-20878\" alt=\"\" srcset=\"https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-DTS-Identity-Alex-Tan-3669-1024x683.jpg 1024w, https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-DTS-Identity-Alex-Tan-3669-300x200.jpg 300w, https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-DTS-Identity-Alex-Tan-3669-768x512.jpg 768w, https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-DTS-Identity-Alex-Tan-3669.jpg 1280w\" sizes=\"(max-width: 800px) 100vw, 800px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c666928 elementor-widget elementor-widget-text-editor\" data-id=\"c666928\" data-element_type=\"widget\" data-e-type=\"widget\" 
data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span data-contrast=\"auto\">Building a more plural, sustainable and profitable future involves promoting inclusion, equity and diversity in the corporate environment. The generation of value in business is maximized in truly inclusive and diverse environments. Even more so because of the transformative and experimental nature of enabling technologies, such as artificial intelligence and its use in conjunction with algorithm programming.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Furthermore, combating technological bias in organizations is part of caring for a capital that is both human and economic, as it aims to respect people, be they shareholders, employees or customers.\u00a0<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Before we delve into ways of mitigating these biases, let\u2019s understand how it happens.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-fd98817 elementor-widget elementor-widget-heading\" data-id=\"fd98817\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">How and why 
there is bias in technology<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-3ed6291b elementor-widget elementor-widget-text-editor\" data-id=\"3ed6291b\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span data-contrast=\"auto\">Among many of his works, the specialist Tarcizio Silva keeps a <\/span><span style=\"text-decoration: underline;\"><span style=\"color: #0000ff; text-decoration: underline;\"><a style=\"color: #0000ff; text-decoration: underline;\" href=\"https:\/\/tarciziosilva.com.br\/blog\/destaques\/posts\/racismo-algoritmico-linha-do-tempo\/\" target=\"_blank\" rel=\"noopener\">timeline<\/a><\/span><\/span><span data-contrast=\"auto\"> that catalogues how often digital media, social media and AI-based devices reinforce racist bias in society.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">The starting point of this collection is 2010, when a feature in Nikon cameras designed to prevent photos with closed eyes repeatedly misread the open eyes of Asian people as closed. It goes on to other emblematic cases that caused outrage, such as one in 2015, when black people were dehumanized and tagged as \u201cgorillas\u201d by one of Google\u2019s tools. 
And it goes right up to the present day, in 2022, with complaints such as the iPhone not being able to register the faces of people with traditional M\u0101ori tattoos; or a start-up that developed software that modifies different accents to sound like the standard white American; or even the facial recognition system of a banking app that could not identify the face of a black account holder.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">As well as being vast, this documentation reinforces that understanding the biases contained in algorithmic systems is not just about analyzing the structure of codes, nor is it about considering such cases as isolated or specific uses. \u201cIt involves identifying which behaviors are normalized, which data they accept, which types of error are or are not considered between system inputs and outputs, their potential for transparency or opacity and for which presences or absences the systems are implemented. In short, to analyze the networks of political-racial relations in technology\u201d, states Silva.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Another researcher who is sharpening her gaze on artificial intelligence is Nina da Hora, who defines herself as a scientist in the making, an anti-racist and decolonial hacker. In her view, \u201cthe racial bias comes from us, who feed the algorithm. It was not born with the algorithm; it was born with our society. 
And it is exceedingly difficult to identify at what stage the negative results begin\u201d.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">This statement was made in an episode of journalist Malu Gaspar\u2019s podcast. In another extract, she explains how algorithmic racism happens by differentiating between image recognition and facial recognition.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">In her words, image recognition often uses algorithms from machine learning trained to recognize important points in the image of a face or an object. From this, there are possible interpretations and decision making. This is the case with social networks, which have pre-collected and organized image bases, such as Facebook, which automatically tags people in a photo.\u00a0<\/span><span data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559737&quot;:0,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Facial recognition, on the other hand, usually takes place in real time and is a technology focused on the face. It starts with the eyes and works its way down, looking for expressive marks to identify a person. It also uses algorithm training, in which there is a base of images through which it will make the match, saying whether or not the person corresponds with who is shown in an image collected, for example, by a security camera. 
Human intervention takes place in both technologies and throughout the collection and analysis process.<\/span><span data-ccp-props=\"{&quot;134233117&quot;:false,&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559737&quot;:0,&quot;335559738&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">For Pedro Guilherme Ferreira, facial recognition tends to be more accurate precisely because it focuses on the geometry of the face. \u201cThis technology takes more account of facial features and less of social characteristics. Image recognition is most often done by association via unsupervised learning and can therefore hold more biases, such as racism\u201d, explains the director.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">The accuracy of these technologies fails due to multiple factors. The bias can be both in the way the image is captured and in the construction of the logic on which the collected image is analyzed. Without a sufficient variety of faces and phenotypes, a predictive model applied to public safety can misidentify a mark of expression and point to an innocent person as guilty, for example. 
But it is not only the image databases used to combat crime that make negative associations with the characteristics of minority groups, such as black, Latino, Asian and other non-white identities.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">In image searches on digital platforms \u2013 which are widely used for services and products, for example \u2013 hair textures typical of black people are matched with pejorative content and associated with negative terms, recalls Da Hora. All of these social stigmas help make the identification of white people more efficient, more accurate and more positively biased.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">The documentary <\/span><span style=\"text-decoration: underline;\"><span style=\"color: #0000ff;\"><a style=\"color: #0000ff;\" href=\"https:\/\/www.codedbias.com\/about\" target=\"_blank\" rel=\"noopener\"><i>Coded Bias<\/i><\/a><\/span><\/span><span data-contrast=\"auto\">, by filmmaker Shalini Kantayya, brings other dimensions to the problem, with layers of experimentation. 
Researcher Joy Buolamwini, PhD and creator of the Coded Gaze and Algorithmic Justice League projects, recounts how an art project using computer vision at the MIT (Massachusetts Institute of Technology) lab made her change direction and study various facial recognition platforms in depth.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Her first idea was to create a mirror that would work on self-esteem and inspiration with the effect of superimposing the faces of ordinary people onto those of personalities, such as tennis player Serena Williams. However, tests with the software used to make the project viable did not work. At this point, the documentary shows scenes similar to those described so far.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">When in front of the machine, Buolamwini\u2019s face is simply not recognized. The reading only becomes possible when, out of curiosity, she decides to put on one of the lab\u2019s costumes: a completely white mask. When she puts it on, the program recognizes her face, but when she takes it off, her real face is not detected. When she analyzed the parameters of the technology, she found that the database was mainly sourced from images of white-skinned people. 
As a result, the system did not learn to recognize faces like hers, of a black woman.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-322ff9d elementor-widget elementor-widget-image\" data-id=\"322ff9d\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" width=\"800\" height=\"450\" src=\"https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-coded-bias-2_JoyInComputer-1024x576.jpg\" class=\"attachment-large size-large wp-image-20871\" alt=\"\" srcset=\"https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-coded-bias-2_JoyInComputer-1024x576.jpg 1024w, https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-coded-bias-2_JoyInComputer-300x169.jpg 300w, https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-coded-bias-2_JoyInComputer-768x432.jpg 768w, https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-coded-bias-2_JoyInComputer.jpg 1500w\" sizes=\"(max-width: 800px) 100vw, 800px\" \/>\t\t\t\t\t\t\t\t\t\t\t<figcaption class=\"widget-image-caption wp-caption-text\">Researcher Joy Buolamwini removes a white mask from her face in one of the scenes from the documentary Coded Bias<\/figcaption>\n\t\t\t\t\t\t\t\t\t\t<\/figure>\n\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-3e73c45 elementor-widget elementor-widget-text-editor\" data-id=\"3e73c45\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div 
class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span data-contrast=\"auto\">The researcher then decided to extend her investigation to other platforms and found that algorithms from giants like Microsoft, IBM and Google performed better with male faces compared to female faces. They also had better results when detecting faces with lighter skin tones, compared to darker skin tones. The big techs were called in and improved their systems to correct the flaws. The crux of the matter, however, which is the bias in technology, is far from a solution.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">\u201cAI is geared towards the future, but it is based on data and that data reflects our history. The past is imprinted in our algorithms\u201d, highlights Buolamwini.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">In another passage from <\/span><i><span data-contrast=\"auto\">Coded Bias<\/span><\/i><span data-contrast=\"auto\">, Cathy O\u2019Neal, PhD and author of the book <\/span><i><span data-contrast=\"auto\">Weapons of Math Distruction<\/span><\/i><span data-contrast=\"auto\">, reinforces the concern about how AI could further affect our lives if inaccurate algorithms continue to be inserted into our daily lives. 
Even loaded with biased readings of the past, they are already used to answer questions such as \u201cwill this person pay back this loan?\u201d or \u201cwill they be fired from their job?\u201d.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">With the participation of many other experts, the documentary points out that machine learning is still not fully understood, and its development is still restricted to a hegemonic and not truly diverse group of people, as it requires a high degree of technical knowledge.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Traditionally, the construction of a code resembles a list of instructions, in which the programmer dictates the rules, and the machine executes. With the evolution of artificial intelligence technologies, coupled with the spread of social media and the massive production of data on the web, machines are now able to learn by interpreting and decoding a mass of information, data sets. So, the algorithm gains autonomy and ends up with a \u201cmargin of maneuver\u201d that is beyond the control of the developers. 
For all these reasons, reflection is urgent and there are many ways to act against biases in technology.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d2ce319 elementor-widget elementor-widget-heading\" data-id=\"d2ce319\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">How to overcome algorithmic racism <\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c29acb3 elementor-widget elementor-widget-text-editor\" data-id=\"c29acb3\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span data-contrast=\"auto\">The world of technology is becoming increasingly inseparable from the social context, whether in everyday tasks like giving voice commands to a device, using filters on social media or, in a slightly more veiled way, in systems that determine if you will have access to a university place and bank credit. 
With this growing influence on all kinds of decision-making, we need to understand how the social sphere is affected.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Algorithmic racism is part of the issue of bias in technology and its solution is as complex as learning how emerging technologies work, especially AI.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Living with diversity and always being vigilant are essential in Pedro Ferreira\u2019s view. \u201cIf you have a heterogeneous team, that is already a good start. Also be careful with the data you are using. Flawed data will probably generate flawed algorithms. Finally, it is particularly important to consider the problem of causality by avoiding spurious relationships. New areas within AI, such as causal modelling, are beginning to shed light on solving these problems\u201d, says the director of EloGroup.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Gabriel Lev\u00ed expands the perspective beyond technical development: \u201cIt is also a question of how people can access opportunities to demonstrate their value\u201d. 
The D&amp;I specialist gives some examples of actions, from the most complex to the simplest:<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">A cosmetics giant partnered with a school specializing in training developers to build up its technology staff within a diversity framework. At the end of the nine-month process, some of the professionals were hired. The rest took part in a job fair and were placed in technology vacancies at other companies.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Smaller actions in selection processes can revolve around mentoring for people from minority groups. In this case, these employees undergo two or three months of training before applying for some of the positions. There is also the possibility of reserving a percentage of vacancies, say 50%, for a specific group of candidates.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Although crucial, the entry point should not be companies\u2019 sole focus of attention. It is essential to create the conditions for these people, within their diversity, to become future decision-makers. \u201cThis is how they can have an impact, for example, on the logic behind an algorithm. They need to occupy positions of power. Establishing a competency framework helps you understand what the next step is, what skills are needed for that group of people to get there. 
Besides, you must give people all the support they need to develop\u201d, adds Lev\u00ed.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">The conclusion is that there is no exact step-by-step guide or definitive list of actions. Often, biases in behavior or judgment are the result of subtle cognitive processes and occur at a level below a person\u2019s conscious perception. This is what Jana\u00edna Gama, senior D&amp;I consultant at Mais Diversidade, teaches us. This means that making decisions unconsciously is a human tendency and naturally has an impact on companies.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">\u201cWe categorize reality according to our own understanding. We memorize the world based on labels and tags\u201d, said Gama at a training session promoted by the EloGroup+ Academy, an initiative of EloGroup\u2019s D&amp;I program. She also emphasized the need to take responsibility and have a firm stance: \u201cYou cannot use unconscious bias as a justification for prejudice, racism and homophobia. 
You must make a commitment to diversity\u201d.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e05a51e elementor-widget elementor-widget-image\" data-id=\"e05a51e\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" width=\"800\" height=\"536\" src=\"https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-DTS-Identity-Alex-Tan-3688-1024x686.jpg\" class=\"attachment-large size-large wp-image-20873\" alt=\"\" srcset=\"https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-DTS-Identity-Alex-Tan-3688-1024x686.jpg 1024w, https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-DTS-Identity-Alex-Tan-3688-300x201.jpg 300w, https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-DTS-Identity-Alex-Tan-3688-768x515.jpg 768w, https:\/\/elogroup.com\/wp-content\/uploads\/2022\/11\/racismo-algoritmico-DTS-Identity-Alex-Tan-3688.jpg 1280w\" sizes=\"(max-width: 800px) 100vw, 800px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-cefa3db elementor-widget elementor-widget-text-editor\" data-id=\"cefa3db\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span data-contrast=\"auto\">Because of this \u201csubtle\u201d facet, there are still those who deny the existence of racism, which in itself is a huge barrier to overcoming it. 
However, it does exist, and it is there in the delegitimization of knowledge and in the blocking of the aesthetic and cultural freedoms of social groups.\u00a0<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Tarcizio Silva uses the concept of \u201cmicroaggressions\u201d to draw attention to the overwhelming amount of aggression that black people face on a daily basis. The term was coined by psychiatrist Chester Pierce in the late 1960s and early 1970s and refers not to the intensity, but to the high incidence of offenses.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">\u201cWhen a black citizen, in her daily interaction with social media and digital systems, has to face problems linked to racism, such as unfair credit scores, differential pricing of services due to her location, or filters that distort selfies, we have constant violence that must be understood and combated\u201d, says the researcher.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">In this sense, the fight can be centered on what Silva calls \u201cdouble opacity\u201d: the way hegemonic groups both promote the false idea that technology is neutral and strive to hinder debate on serious violations, such as racism and white supremacy. 
\u201cThe irresponsible use of supposedly natural databases for training, without full filtering or curation, promotes the worst in society when it comes to data collection. And the layers of opacity in the production of models, implementation and adjustments are defended by large companies on grounds of cost-benefit and business secrecy\u201d, he explains.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">Lev\u00ed uses a phrase from the jurist and philosopher Silvio de Almeida to explain how racism is rooted in society: \u201c\u2018Racism happens when things are normal\u2019. People have the false idea that racism is outside the norm, but it is the <\/span><i><span data-contrast=\"auto\">status quo<\/span><\/i><span data-contrast=\"auto\">. Thinking about specific actions for each group is what is remarkable about diversity work. When developing a product, an app or a training program, you must understand who the people are, what they want to hear and what they have to say\u201d.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p><p><span data-contrast=\"auto\">At an organizational level, we need to build an open corporate culture, capable of fostering a sense of responsibility in those who develop and use digital assets. Inclusive leadership begins with awareness of our own biases. We need a broad understanding that people are diverse and that not all social groups are truly included in the market. 
Affirmative action, in the medium and long term, must be able to bring diversity to the most strategic layers of the organization, influencing decision-making, diversifying sources of information and promoting moments for everyone to share their stories. This leads to a corporate environment in which there is psychological safety and in which everyone can contribute with high efficiency and creativity.<\/span><span data-ccp-props=\"{&quot;134233118&quot;:false,&quot;201341983&quot;:0,&quot;335551550&quot;:6,&quot;335551620&quot;:6,&quot;335559685&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:360}\">\u00a0<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Technology is constituted and makes sense through people. For this reason, it is also capable of carrying biases and reproducing discrimination that affects everyone, including 
companies.<\/p>\n","protected":false},"author":9,"featured_media":20870,"parent":0,"template":"","editorias":[140],"industrias-category":[130,142],"praticas-category":[127,154,157],"insights-category":[162],"class_list":["post-23339","insights","type-insights","status-publish","has-post-thumbnail","hentry","editorias-unlock-the-good-en","industrias-category-education","industrias-category-government","praticas-category-ai-algorithms","praticas-category-esg-en","praticas-category-people-organizations","insights-category-unlock-the-good-en"],"_links":{"self":[{"href":"https:\/\/elogroup.com\/en\/wp-json\/wp\/v2\/insights\/23339","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/elogroup.com\/en\/wp-json\/wp\/v2\/insights"}],"about":[{"href":"https:\/\/elogroup.com\/en\/wp-json\/wp\/v2\/types\/insights"}],"author":[{"embeddable":true,"href":"https:\/\/elogroup.com\/en\/wp-json\/wp\/v2\/users\/9"}],"version-history":[{"count":5,"href":"https:\/\/elogroup.com\/en\/wp-json\/wp\/v2\/insights\/23339\/revisions"}],"predecessor-version":[{"id":23345,"href":"https:\/\/elogroup.com\/en\/wp-json\/wp\/v2\/insights\/23339\/revisions\/23345"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/elogroup.com\/en\/wp-json\/wp\/v2\/media\/20870"}],"wp:attachment":[{"href":"https:\/\/elogroup.com\/en\/wp-json\/wp\/v2\/media?parent=23339"}],"wp:term":[{"taxonomy":"editorias","embeddable":true,"href":"https:\/\/elogroup.com\/en\/wp-json\/wp\/v2\/editorias?post=23339"},{"taxonomy":"industrias-category","embeddable":true,"href":"https:\/\/elogroup.com\/en\/wp-json\/wp\/v2\/industrias-category?post=23339"},{"taxonomy":"praticas-category","embeddable":true,"href":"https:\/\/elogroup.com\/en\/wp-json\/wp\/v2\/praticas-category?post=23339"},{"taxonomy":"insights-category","embeddable":true,"href":"https:\/\/elogroup.com\/en\/wp-json\/wp\/v2\/insights-category?post=23339"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}