After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies are now reviving these systems. From lie-detection tools tested at the border to systems that verify documents and transcribe interviews, a wide range of solutions is being applied to asylum applications. This article explores how these technologies have reshaped the way asylum procedures are conducted. It shows how asylum seekers are turned into forced, hindered techno-users: they are required to follow a series of techno-bureaucratic steps and to keep up with unpredictable changes in criteria and deadlines. This obstructs their capacity to navigate these systems and to pursue their legal right to protection.
It also shows how these technologies are embedded in refugee governance: they facilitate the ‘circuits of financial-humanitarianism’ that operate through a whirlwind of dispersed technological requirements. These requirements deepen asylum seekers’ socio-legal precarity by hindering their access to the channels of protection. The article further argues that analyses of securitization and victimization should be coupled with insight into the disciplinary mechanisms of these technologies, through which migrants are turned into data-generating subjects who are disciplined by their reliance on technology.
Drawing on Foucault’s notion of power/knowledge and local knowledge, the article argues that these technologies have an inherent obstructiveness. They have a double effect: although they help to expedite the asylum process, they also make it difficult for refugees to navigate these systems. Refugees are placed in a ‘knowledge deficit’ that leaves them vulnerable to illegitimate decisions made by non-governmental actors, and to ill-informed and unreliable narratives about their cases. Moreover, these technologies pose new risks of ‘machine mistakes’ that may result in inaccurate or discriminatory outcomes.