After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies are now reviving these systems. From lie-detection tools tested at borders to programs that verify documents and transcribe interviews, a wide range of technologies is being applied to asylum applications. This article explores how these tools have reshaped the way asylum procedures are conducted. It reveals how asylum seekers are transformed into obligated yet hindered techno-users: they are asked to comply with a series of techno-bureaucratic steps and to keep up with unpredictable changes in criteria and deadlines. This obstructs their capacity to navigate these systems and to pursue their legal right to protection.
It also shows how these technologies are embedded in refugee governance: they facilitate the ‘circuits of financial-humanitarianism’ that operate through a flurry of dispersed technological requirements. These requirements increase asylum seekers’ socio-legal precarity by hindering their access to the programs of protection. The article further argues that studies of securitization and victimization should be coupled with an analysis of the disciplinary mechanisms of these technologies, through which migrants are turned into data-generating subjects who are disciplined by their reliance on technology.
Drawing on Foucault’s notion of power/knowledge and local knowledge, the article argues that these technologies have an inherent obstructiveness. They have a double effect: even as they help to expedite the asylum process, they also make it difficult for refugees to navigate these systems. Refugees are placed in a ‘knowledge deficit’ that leaves them vulnerable to erroneous decisions made by non-governmental actors and to ill-informed and unreliable narratives about their cases. Moreover, the technologies pose new risks of ‘machine mistakes’ that may produce inaccurate or discriminatory outcomes.