
Species ex machina: ‘the crush’ of animal data in AI

Published online by Cambridge University Press:  12 October 2023

Simon Michael Taylor*
Affiliation:
School of Regulation and Global Governance, Australian National University, Australia

Abstract

A canonical genealogy of artificial intelligence must include technologies and data built with, for and from animals. Animal identification using forms of electronic monitoring and digital management began in the 1970s. Early data innovations comprised RFID tags and transponders, followed by digital imaging and computer vision. Initially applied in the 1980s in agribusiness to identify meat products and to classify biosecurity data for animal health, computer vision is now interlaced in subtler ways with commercial pattern recognition systems that monitor and track people in public spaces. As such, this paper explores a set of managerial projects in Australian agriculture connected to computer vision and machine learning tools that contribute to dual use. Herein, ‘the cattle crush’ is positioned as a pivotal space in which animal bodies are interrogated by AI imaging, digitization and data transformation with forms of computational and statistical analysis. By disentangling the kludge of numbering, imaging and classifying within precision agriculture, the paper highlights a computational transference of techniques between species, institutional settings and domains that is relevant to regulatory considerations for AI development. The paper posits that a significant sector of data innovation – concerning uses on animals – may evade the level of regulatory and ethical scrutiny afforded to human spaces and settings, and may thereby allow these systems to be optimised beyond our recognition.

Type
Research Article
Creative Commons
Creative Commons Licence: CC BY-NC-ND
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (https://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is unaltered and is properly cited. The written permission of Cambridge University Press must be obtained for commercial re-use or in order to create a derivative work.
Copyright
Copyright © The Author(s), 2023. Published by Cambridge University Press on behalf of British Society for the History of Science

Data is everywhere, but who can harvest it?Footnote 1

A year or so before the first known case of bovine spongiform encephalopathy (BSE), the interspecies ‘mad cow disease’, was discovered in British cattle in April 1985, I witnessed cattle herded into a chute known as a ‘cattle crush’.Footnote 2 This was in gum-tree-logging country of East Gippsland, Australia (land repurposed from the indigenous nations of the Bidewell, Yuin, Gunnaikurnai and the Monero or Ngarigo people).Footnote 3 As a child, I watched cattle with floppy ear tags and flexing muscles being led up a narrow chute. Their heads were secured by a neck yoke. Pneumatic constraints aligned the animal bodies to restrict their movements. Each animal was inspected by hand, assessed for a unique ear-tag code, and positioned for inoculations. I recall the laborious nature of these tasks – for animals and humans – but did not comprehend how the architecture, leading into iron-slatted trucks, bound cattle for slaughter. The ‘crush’ has not only informed a supply chain of market produce; it is also an infrastructural component for recording data and, as argued here, contributes to developing AI systems.Footnote 4

Equivalents to the cattle crush, such as the French shoeing trevis (a frame used when shoeing hoofed animals), have been central to animal husbandry and herd management across multiple cultures. From the 1980s, the ‘crush’ was incrementally modified to incorporate computerized processes for biosecurity, vaccinations, electronic tagging and digital meat assessment.Footnote 5 In the United States, Temple Grandin led the majority of industrial designs for cattle crushes and animal-handling environments during this era. Aiming to decrease the stress caused to animals by intrusive and unnecessary manual handling, she designed the ‘crushes’ and handling facilities through which over one-third of the cattle and hogs processed across the country pass.Footnote 6 In the Netherlands in 1983, experiments with video image analysis (VIA) utilized digital video as a grading device for meat quality.Footnote 7 The objective was to assist farmers to better visualize and calculate yield on animals for commercial purposes. The crush abetted uses of electronic sensors, digital logging and automated algorithms to optimize the architectural and informational corridors leading cattle to slaughter. Experimental data captured from live animals was analysed across image generation, digital processing and graphical interpretation.Footnote 8 In 1991 Robin D. Tillett of the Robotics and Control Group at the Silsoe Research Institute provided the first overview of agricultural innovations and challenges in the use of visual computer analysis.Footnote 9 Infrastructure built around the legacy of the ‘crush’ for animal identification and the datafication of farm produce in turn contributes to other AI systems. The ‘crush’ is not only a pivotal component of historical, architectural and informational innovations (as seen in Figure 1) in a competitive agri-food supply chain; it is also characterized by ‘third-party’ artificial-intelligence (AI) products and data companies fuelling different interests.Footnote 10 This includes undertakings to improve the monitoring of vehicles and people in airports or at train stations. As this paper argues, key technologies and data shaped in the agricultural sector may transfer across domains, species, networks and institutions.

Figure 1. A layout for corralling livestock in Australia and South America, including a squeeze or ‘cattle crush’ as part of the electronic sorting and computerized systems. Reprinted from Temple Grandin, ‘The design and construction of facilities for handling cattle’, Livestock Production Science (1997) 49(2), pp. 103–19, 112, with permission from Elsevier.

In Australia today, off-the-shelf Microsoft Kinect cameras (originally developed by PrimeSense for use in gaming consoles) are deployed in the crush to build a three-dimensional data model of individual cattle. Using open-source software development kits (SDKs) and computer vision tools, farmers visualize animal bodies aligned within the architecture of the crush. In a form of infrared projection mapping, a pattern of light spots is projected onto an object – here, a cow's body – so that a computer vision unit can detect the illuminated region. Any shift in the received pattern, relative to a static reference capture, helps display the shapes in real time and render them as a contoured three-dimensional image. By reconstructing the light as a three-dimensional data point cloud, a visual map of the interior body volume can be configured.Footnote 11 This digitized process is similar to other ‘under-the-skin’ techniques, like ultrasonography, neuroimaging and computed tomography. But this one is designed specifically for low-cost adoption, digital storage and fast analytic capacity, motivated by agricultural profit margins.
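The geometry behind this step can be sketched briefly. The following minimal illustration, assuming a simple pinhole camera model and Kinect-style intrinsic parameters (the specific values and the synthetic ‘depth image’ are hypothetical, not drawn from any farm system described here), shows how a depth map is back-projected into the kind of three-dimensional point cloud discussed above:

```python
import numpy as np

def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Back-project a depth image (in metres) into a 3-D point cloud.

    depth_m : (H, W) array of depth values in metres (0 = no return)
    fx, fy  : focal lengths in pixels; cx, cy : principal point.
    Returns an (N, 3) array of XYZ points in the camera frame.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth_m
    x = (u - cx) * z / fx          # pinhole back-projection
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth return

# Hypothetical example: a flat synthetic 'surface' two metres from the sensor,
# with intrinsics loosely in the range of a Kinect-class depth camera.
depth = np.full((424, 512), 2.0)
cloud = depth_to_point_cloud(depth, fx=365.0, fy=365.0, cx=256.0, cy=212.0)
print(cloud.shape)  # (217088, 3)
```

In practice the resulting cloud is filtered and registered against a reference capture of the empty crush, so that only points belonging to the animal's surface remain for volumetric analysis.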

Such imaging techniques afford ‘a digitized dissection’, displaying the animal in statistical curve signatures used to quantify meat products with machine learning. Algorithms convert the curves into computational screen segments classed as volumetric image proportions. This approach helps to calculate precise meat and fat proportions that can be correlated against real-time estimates of export market prices. Furthermore, these statistics can train autonomous robotic systems to slice each carcass into portions that maximize the yield.Footnote 12 Such data is also proving noteworthy in facilitating sales by digital currency for the first time. In 2023 FarmGate, an Australian rural digital payments platform, allowed accredited vendors to self-assess livestock for sale without a human agent. The platform allows buyers to trace progeny and yield rates based on visualized marble scores that connect and convert easily ‘into new, distributed software platforms and flicking between trading systems’.Footnote 13
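As a purely illustrative sketch of the kind of calculation such data enables (all weights, proportions and prices below are invented for illustration, not taken from any cited experiment or platform), estimated composition can be multiplied against per-kilogram market prices to rank animals by projected value:

```python
# Hypothetical worked example: converting an estimated composition into a
# projected value per animal. Figures are invented for illustration only.
def projected_value(live_weight_kg, dressing_pct, lean_pct, price_per_kg_lean):
    carcass_kg = live_weight_kg * dressing_pct   # estimated carcass weight
    lean_kg = carcass_kg * lean_pct              # estimated saleable lean meat
    return lean_kg * price_per_kg_lean

print(projected_value(600, 0.54, 0.68, 7.50))  # -> 1652.4 (currency units)
```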

As modes of visualization, classification and data analysis, these techniques expand scientific traditions of what Knorr-Cetina and Amann call ‘image arithmetic’ to derive optimum solutions for any visual phenomenon: in this case ‘exploiting calculable properties of visual marks … to derive inferences from image features’.Footnote 14 Yet such data and digital infrastructure do not remain with animals. Thanks to machine learning, digital networks and the data transformations they afford, information representing objects, events, animals and individuals is widely distributed. In many cases the originators and owners of this data have no control over how it is shared, sold on, transformed and duplicated for other purposes.

Historians have done a great deal to show that over time the management of animals and their associated food and agricultural products has relied on diverse techniques of industrialization, identification and product traceability. These have ranged across the trade in animals marked by tags, administrative documentation, certification and branding practices in diverse societies, and extend to environmental management techniques that monitor wild species.Footnote 15 Entwined with histories of domestication, anatomical dissection, technical instrumentation, material containment, environmental disruption and empire building are vast genealogies of data. The global development of industrial agriculture enabled the birth of what some agriculturalists called ‘a new information stage’.Footnote 16 By leveraging forms of trade with digital interfacing, this agricultural infrastructure for managing animals has also enabled the constitution of AI systems.

As yet, we do not have a comprehensive history of AI and computer vision systems built in cattle agriculture. This paper offers a preliminary outline of some key developments, starting with electronic identification systems of the 1970s and imaging experiments of the 1980s. The aim is to build an understanding of the intensified computational analysis of animals and the cross-domain repurposing of AI and data surveillance.Footnote 17 To be specific, my empirical focus on computer vision techniques in the ‘crush’ reveals how consumer-grade video devices can be customized with open-source software amenable to statistical analysis and machine learning. This relies upon animals freely available for data extraction and bodily manipulation.Footnote 18 In examining the use of machine learning to identify and grade living cattle, the paper further exposes how this innovation builds on training from a 1988 anthropometric military data set. Outlining the series of scientific and institutional connections that utilize this menagerie of data exposes cross-domain purposes allowing all bodies to be sorted, categorized and transformed. The objective is to better understand how techniques of AI are entangled with multiple uses. Doing so requires this cross-species approach, to reveal what Peter Linebaugh described as ‘the many-sided oppositions of living labour brewing within and among modes of technical AI’.Footnote 19

Systems of dual use

It is understandable that the possibility of capturing precise information with computer vision, correlated to market prices in real time, has seduced the imagination of farmers and data scientists alike.Footnote 20 An emerging ‘precision agriculture’ has recently been touted as a way to tackle climate change and population growth, and to assist farmers to optimize decision making across the supply chain.Footnote 21 For example, in 2016 the national organization for meat and livestock management, Meat and Livestock Australia (MLA), publicly advocated the use of computer vision tools in agricultural management. ‘The technology is simple and robust’, they claimed, following trials conducted by the University of Technology Sydney Centre for Autonomous Systems. Working with Angus cattle on farms and saleyards around Armidale and Grafton in New South Wales, MLA promoted cameras as part of everyday farm management practices: they are ‘easy to operate and source; the sophisticated computer technology does the rest’.Footnote 22 As historians are deeply aware, different elements in supply chains have been modularized, adapted, linked and repurposed across different eras and geographical spaces, and in different networks.

As Simon Coghlan and Christine Parker have argued, at least since Ruth Harrison's Animal Machines (1964) the radical industrialization of farming has produced technical, high-density and mostly automated ways to turn non-humans into quantified objects.Footnote 23 Considering a seven-hundred-year history of domesticating canine breeds, Edmund Russell postulated that ‘animals are technology’ and that industrial and scientific interventions into animal lives have co-evolved with human beings.Footnote 24 Ann Norton Greene explains this coevolution in her history of horses being put to work and functioning as hybridized military instruments during the American Civil War.Footnote 25 She and other scholars have used the concept of envirotech as a frame of interpretation, one that positions animals as integral to the creation of modern technology and calls for a more entwined or distributed view of the intersections between nature, agriculture, humans, animals and technology than historians have traditionally attended to.Footnote 26 By examining both how animals can be technologies and the uses of technology upon animals, we may extend Donna Haraway's insistence that animals bring specific solidities into the apparatus of technological production; herein they can be viewed as sites of technical amalgamation in AI systems, and as hybrid data components that may shape other uses.Footnote 27

Taking envirotech studies as a locus of interpretation, the crush becomes a pivot point in this paper for drawing together computerized designs and ancillary data from animal identification into the frame of AI and the modularized componentry it requires to operate. On the one hand, this emphasizes the material technologies that enable techniques of machine learning to be brought onto the farm. It also complements Etienne Benson's account of cattle guards in the United States, with the crush as ‘a constructed space of encounter where bodies of machines, animals, and humans weave complex paths around each other’.Footnote 28 This includes infrastructure to visually datify and anatomically dissect hundreds, if not thousands, of distinct economic products from animal bodies.Footnote 29 On the other hand, by tracing different empirical and scientific experiments conducted across global research contexts, potential problems are highlighted when AI techniques are pioneered across domains and used in commercially significant and globalized data contexts. It is perhaps due to the commercial proliferation and duplication of AI and machine learning models that repurposed training data sets and common algorithmic decision-making modules allow ‘harms that could have been relatively limited or circumscribed’ to be scaled up from techniques on animals to other uses very quickly.Footnote 30

As Herbert Simon, a conceptual pioneer in the AI field, stated, ‘cattle are artifacts of our ingenuity … adapted to human goals and purposes’.Footnote 31 Yet if cattle and ‘the crush’ have helped optimize computational methods, as argued here, then Amoore and Hall's complementary account of computer visualization in airport border security as ‘digital dissection’ shows a modular constitutive element that may speed up automated technical decision making on any body.Footnote 32

Most computer scientists in the AI field cannot forecast how code will operate elsewhere, what values it carries, or ‘where it might have a “dual use” in civilian/military sectors’.Footnote 33 As imaging has improved, computer algorithms can be customized and built to detect varied shapes. Once the internal structure of animal bodies is digitized as shape and data, that information is distributed across machine-learning pipelines, networked systems and modelling comparisons, and integrated into market supply and human-tracking purposes.Footnote 34

The concept of ‘dual use’ describes cases in which an intended use or primary purpose for a technology that is ‘proposed as good (or at least not bad) leads to a secondary purpose which is bad and was not intended by those who developed the technology in the first place’.Footnote 35 Kelly Bronson and Phoebe Sengers suggest that agribusinesses may avoid the degree of public activism raised in opposition to big data and algorithmic decision making in human social systems: despite innovating with similar machinery, resources and labour to collect data and to translate it into efforts to strengthen market positions, agribusinesses are not yet read as handling sensitive data or as training bodily surveillance systems.Footnote 36 As agronomic data is captured and structured for digitization, much is also rendered reproducible, exportable and interoperable through vast ‘networked arrays that capture, record, and process information on a mass scale … and that require significant capital resources to develop and to maintain’.Footnote 37 This paper therefore poses questions that cut across shifting data infrastructures and into empirical cases where such a viewpoint intersects with regulatory concerns, data sovereignty and ‘dual-use’ implications.Footnote 38 The next section characterizes how these relationships were initially forged in international research communities of the 1970s and 1980s that brought electronic animal identification systems together for biosecurity purposes and started utilizing computer recognition experiments.Footnote 39

The ‘crush’ of electronic and computed animal identification

On 8 and 9 April 1976, the first international symposium on Cow Identification Systems and their Applications was held at Wageningen, the Netherlands, presenting results from British, German, US and Dutch farming practices, research institutes and private companies.Footnote 40 Food logistics companies such as Alfa Laval (notable for its early milking machines and centrifugal milk–cream separators) and the Dutch company Nedap displayed early inductively powered identification systems.Footnote 41 Nedap exhibited wearable electronic transponders, collars attached around a cow's neck, that were read as the animal passed through a livestock ‘interrogation corridor’ (by 1997 nearly two million units had been sold worldwide).Footnote 42 Radio frequency microchip circuits called RFID tags were developed for use on cattle following the invention of short-range radio telemetry at Los Alamos in 1975, with the support of the Atomic Energy Commission and the Animal and Plant Health Inspection Service of the US Department of Agriculture.Footnote 43 RFID tags were not used merely to track cattle ranging in open fields but to read their identity as tagged objects passing within the range of a microwave antenna device.Footnote 44 The crush offered a convenient interrogation site. This technology found similar uses in logging motor vehicles at toll booths and in recording package confirmation at supermarkets, post offices or shipping terminals. A primary aim in each case was to identify and to obtain product data speedily for real-time economic processing. Since the RFID tag could also store information on the age, sex or breeding of the animal and connect the owner, farm and vaccination status, this facility fed into data administration and raised the possibility of integrating different elements in real time for financial management.Footnote 45 From 1984, the emergence of mad cow disease in Britain led international regulators to call for improved traceability and information on meat product origins. The tasks were to record and indicate in which countries animals were born, reared, slaughtered and cut (or deboned), including an identifiable reference for traceability. Prior to mad cow disease, much product information would not go further than the slaughter stage.Footnote 46 Monitoring each animal, from progeny data to real-time abattoir processing, reduced labour constraints and is considered a ‘critical success factor’ in the economic decision making of scaling an agri-business.Footnote 47

That the traceability of animals and meat products was an international security concern became recognized at the first International Organization for Standardization (ISO) meeting about animal identification products in 1991.Footnote 48 To ensure interoperability and design compatibility between different manufacturers, ‘some millions of devices’ were tested in Europe; problems with tampering led to a recommendation that ‘devices should be fixed to an animal’.Footnote 49 This challenge to register each animal via electronic devices became a significant driver of all sorts of prototyping. First-generation devices, such as RFID tags and transponders, were modified to measure body temperature and heart rate for animal fertility and health. In a 1999 special issue of Computers and Electronics in Agriculture, Wim Rossing of the Wageningen University Institute of Agricultural Engineering provided a survey devoted to electronic identification methods in this period. The use of ingestible oral transponders and injectables placed under the epidermis showed that many devices were adapted from medical sensing instruments. Many devices attached to or injected into animals entailed some stress and perhaps pain, and often required bodily restraint ‘in a crush’.Footnote 50 Unobtrusive methods were sought that would influence behaviour and health less than attaching transponders, electronic collars or ingestibles. The search for a method to identify each specimen reliably and permanently, with no adverse effects on animals, led to an increase in remote biometric modelling. Important to this discussion is how the use of a stanchion (a prototypical ‘crush’) was critical to obtaining cow noseprints in 1922; such identification methods regained traction with the rise of remote digital imaging and computational biometric modelling from 1991.Footnote 51 The resulting experiments produced material ingenuity and information traded between species, sectors, databases and mathematical models.Footnote 52

In the digital prototyping and algorithmic assessment of animals that followed, many sought to adopt the computer vision and AI pattern recognition experiments on static objects and human beings prevalent from the 1980s. These involved algorithms that improved object recognition performance from curves, straight lines, shape descriptions and structural models of shapes.Footnote 53 In 1993, mathematical algorithms began to isolate, segment and decipher the different shapes found in animal bodies, specifically at the compositional boundary between fat and meat.Footnote 54 Computer vision methods adopted technology from medicine, anatomy and physiology. For example, magnetic resonance imaging (MRI) was used to determine composition in living animals with ‘high contrast, cross-sectional images of any desired plane … to measure volumes of muscles and organs’, but proved cost-prohibitive for large-scale mass-market adoption.Footnote 55 Similarly, from 1994 to 2003 the US Food Safety and Quality Service (FSQS) collaborated with NASA's Jet Propulsion Laboratory on the instrumental assessment of beef, comparing ultrasound with remote video analysis.Footnote 56 However, using computer algorithms in combination with digital video was considered more viable. This was due to cost, and to 1980s and 1990s research on butchery that showed an ability to efficiently separate and segment bodies into digital shapes to enhance robotic butchery operations.Footnote 57 As digital shape and imaging analysis spread, data diffused across arrays of information taken from different sectors, bodies and laboratory experiments became difficult to trace back to their original sources.Footnote 58 How was it then possible to integrate such computerized analysis in agriculture, yet keep these uses separate in each domain? As Bowker and Star claim, the classificatory application of these digital assessment tools can become interoperable in infrastructure: tools of this kind can be inverted, plugged into and standardized for uses elsewhere that are ‘not accidental, but are constitutive’ of each other.Footnote 59 On 4 November 2010, techniques for three-dimensional imaging illustrated this point with the sale of a consumer gaming device – the Microsoft Kinect, a depth-sensing camera for the Xbox console. The Kinect topped ten million sales in the first three months after launch owing to its global availability, and a customizable software development kit (SDK) was released in 2012. Scientific researchers in electronic engineering and robotics quickly began leveraging the three-dimensional imaging tool for ways to analyse, simulate and predict body movements using biometrics and machine learning.Footnote 60

Although it takes work, the perceived value of machine- and deep-learning models is their purported ability to facilitate ‘a general transformation of qualities into quantities … [enabling] simple numerical comparisons between otherwise complex entities’.Footnote 61 From these sources of data and the statistical methods of machine learning, multiple practices bloomed in the domains of biometric identification, virtual reality, clinical imaging and border security. Yet the value of animals was their utility as unwitting data accomplices to imaging experiments that could train machine-learning processes without close ethical scrutiny. History foregrounds that such systems are not, in fact, uniformly applied, nor isolable in their uses or development.

If computer vision is trained on data from which to make new classifications and inferences beyond explicit programming, then what is gained or lost if techniques and technologies are transferred between different sectors and sites, and between humans and non-human animals? The next section examines the implications of these interconnections by focusing on computer vision experiments conducted in the past decade in New South Wales, Australia.

‘Extreme values’ on the farm

Australian government officials have offered guides on how to manually palpate an animal, assessing by hand the proportions of anatomical fat, muscle and tissue depth in order to grade composition and thus speculate on profitability.Footnote 62 Yet the graphical and anatomical knowledge of flesh, bones, sinew, skin and tissue as the primary determination for ‘economical cuts of meat’ has since been transformed into digital visualization. Computer vision has transformed the economic decision making taking place upon, or within, animals. Assessments historically made by hand, or intimately by knife, could now be performed remotely and largely unobtrusively on creatures properly aligned to digital means. Indeed, early dissection techniques can be correlated to the pictorial knowledges that multiple societies displayed in anatomical drawings of animal and human bodies. Represented by mathematical fragments, segments and curved geometries, these line drawings are considered ‘graphical primitives’ for the calculation of shapes and volumes that have since coalesced into ‘computational relations between them’.Footnote 63 As James Elkins claims, software developers that render three-dimensional computer graphics ‘have inherited pictorial versions of naturalism as amenable to computation’ and have transferred them to uses in scientific, medical and military sectors.Footnote 64

Contemporary animal identification commonly utilizes open-source software architectures and modular camera systems – in part because the availability of pre-existing software, standardized training data and stabilized image labelling translates into ease of adoption.

In 2018, the global food conglomerate Cargill invested in the Irish dairy farm start-up Cainthus (named after the corner of the eye), whose computer vision software claims to identify individual animals through facial recognition. According to Dave Hunt, Cainthus CEO, this precision management tool ‘observes on a meter-by-meter basis … there is no emotion, there is no hype … just good decisions and a maximization of productivity’.Footnote 65 These types of facial recognition system were also tested in Australia during 2019, utilizing four cameras to simultaneously capture ‘on average 400 images per sheep’ as a proof of concept and as a data library for sheep identification.Footnote 66 The manufacturing of a sheep face classifier using Google's Inception V3 open-source architecture (trained on the ImageNet database; see Bruno Moreschi's contribution to this issue) illustrates how consumer-grade items, such as Logitech C920 cameras, can be quickly integrated into data analysis in modern farm environments. Yet this also changes the types of data that can drive machine learning analytical pipelines, in particular the use of statistical analysis tools.
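For readers unfamiliar with how such a classifier is assembled, a minimal sketch of transfer learning on the Inception V3 architecture follows. It assumes the Keras API distributed with TensorFlow and a hypothetical directory of cropped sheep-face images organized by animal identity; it is illustrative only and is not the cited project's code.

```python
# Minimal transfer-learning sketch (assumed setup, not the cited project's code):
# reuse ImageNet-trained Inception V3 as a frozen feature extractor and train a
# small classification head to distinguish individual sheep faces.
import tensorflow as tf

NUM_SHEEP = 50  # hypothetical number of individual animals

base = tf.keras.applications.InceptionV3(
    weights="imagenet", include_top=False, input_shape=(299, 299, 3))
base.trainable = False  # keep the ImageNet features fixed

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # Inception V3 expects [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_SHEEP, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical dataset layout: sheep_faces/<animal_id>/<image>.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "sheep_faces", image_size=(299, 299), batch_size=32)
model.fit(train_ds, epochs=5)
```

The point of the sketch is that almost nothing here is bespoke: the backbone, the training data format and the labelling conventions are all borrowed, which is precisely what makes adoption on the farm so quick.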

Two years earlier, in 2017, Australian robotics researchers from the University of Technology Sydney had partnered with Meat and Livestock Australia, using a purpose-built measurement ‘crush’ for cattle handling adapted to accommodate two off-the-shelf RGB-D Microsoft Kinect cameras. The cameras were positioned to capture three-dimensional images of cattle while standing; the goal was to precisely digitize the fat and muscle composition of living Angus cows and steers.Footnote 67 The second step was to train the computer vision system to produce a robust predictive function for these meat–fat marbling traits and then to calculate an optimal yield from each beast.

In this experiment, the data model was intended to ‘learn’ a non-linear relationship between the surface curvatures of an animal, represented within a Kinect three-dimensional point cloud, and statistical values for the relative proportions of muscle score (MS) and P8 rump fat that could be used to estimate market prices.Footnote 68 To test how accurately their statistical model performed as a machine-learning evaluation, the team compared its results with those of expert human cattle assessors and with measurements deduced from ultrasound detection. The proof of concept aimed to demonstrate both the importance of ‘curvature shapes’ as a digital measure that could accurately verify product yield, and the feasibility of modelling such statistical ‘traits’ as assigned values, with the machine learning improving with each animal.
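The general form of this modelling step can be sketched as follows. The illustration below uses a generic non-linear regressor from scikit-learn rather than the team's actual model, and entirely synthetic curvature features and fat-depth targets; it simply shows how surface-curvature statistics extracted from a point cloud might be regressed against a reference measurement such as ultrasound-derived P8 fat depth.

```python
# Illustrative sketch only: regress hypothetical surface-curvature features
# against a reference measurement (e.g. ultrasound P8 fat depth), standing in
# for the non-linear 'curvature shape' modelling described above.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic data: 200 'animals', each summarised by 12 curvature statistics
# (for example, binned principal-curvature histograms over the rump and loin).
X = rng.normal(size=(200, 12))
# Synthetic target: P8 fat depth (mm) as a non-linear function of the features
# plus noise. Entirely invented for illustration.
y = 8 + 3 * np.tanh(X[:, 0]) + 2 * X[:, 1] ** 2 + rng.normal(scale=1.0, size=200)

model = GradientBoostingRegressor(n_estimators=300, max_depth=3, learning_rate=0.05)
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
print("Cross-validated MAE (mm):", -scores.mean())
```

Whatever the particular regressor, the epistemic move is the same: bodily shape is reduced to a vector of numbers, and the numbers are benchmarked against another instrument's reading of the same body.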

Yet in the precision that the experiment claimed to have reached there is a deeper phenomenon of scientific dislocation. Historians describe this as ‘the accumulation of evidence from converging investigations in which no single experiment is decisive or memorable in method or in execution’.Footnote 69 In this instance, curiously, the statistical and algorithmic data relationships deduced from animal shapes had first been trained on a 1988 data set of 180 anthropometric measurements taken on almost nine thousand soldiers in the US military.Footnote 70 The data, collected at the United States Army Natick Research, Development and Engineering Center under contract to Dr Claire C. Gordon, chief of the Anthropology Section, Human Factors Branch, used a method of software analysis engineered by Thomas D. Churchill, titled the X-Val or ‘extreme values’ computer program.Footnote 71 It is important to locate this military data historically because of the mobility of its bankable metrics, amenable to software innovation, which then became adaptable and transformable across species, continents and digital evaluations.

A secondary review of the 1988 Natick data set conducted in 1997 deduced ‘the undesirability of applying data from [such] non-disabled [and highly fit] populations’ to the design of equipment or techniques for bodies of other types.Footnote 72 And yet in this instance the research institutions involved in the cattle crush experiment had utilized the data in 2012 to create a ‘head-to-shoulder signature’ to monitor people across office environments, and then five years later applied these data sets to animals.Footnote 73 Arguably, due to the collaborative nature of the research groups, the multiple institutions involved and the sectors of interest in robotics and video analytics, the knotting of methods used to digitize animals, to track people and to identify body parts was overlooked as unrelated or inconsequential. Yet the studies show that the team continued to use statistical head-to-shoulder signatures to reduce human beings to synthetic shapes classed as ‘ellipsoids’. Statistical data from the 2017 cattle crush experiment boosted this automated recognition system to count and track individuals at a city train station.Footnote 74

Comparing the mobility of data from the ‘cattle crush’ experiment to the ‘train station’ experiment, a set of relations can be drawn forward to show how the modelling of humans and animals produces interoperable data sets for computer vision. In their reduction to metrics, statistics and ‘shapes’, a combination of visual cultures in scientific experimentation collides with an odd epistemic transference between species. The slippage is a reduction of bodies to ‘curve signatures’ that permits cannibalizing data from otherwise unrelated sources. Yet no matter the efforts to reduce or to recontextualize such information, it remains sensitive, at least in its potential to foment and formalize a transfer into dual-use arenas.

The politics of such data entanglements remain under-studied. However, in 2019 the Australian Strategic Policy Institute initiated a review into the University of Technology Sydney because of concerns that five experiments developing mapping algorithms for public security could contribute to China Electronics Technology Corporation products used in Chinese surveillance systems. This resulted in the suspension of a public-security video experiment due to ‘concerns about potential future use’ and triggered a review of UTS policies.Footnote 75 Similarly, Australian entities involved in critical infrastructure projects, such as those in the transport and agricultural sectors, are subject to laws requiring supply chain traceability, so as to resist benefiting from exploited labour that may subsist in third-party data products.Footnote 76 These regulatory efforts could be extended to statistical and computer vision experiments on animals and in agriculture that may escape this type of notice. As this paper seeks to illustrate, collaborative ventures of scientific innovation create ‘a snarled nest of relations’ manufactured across data and supply chainsFootnote 77 – in this case, an appropriation of metrics that maximize military forces, surveil civilians or increase farm profits.

Conclusion: sectors that resist guardrails in AI

Lorraine Daston observed that ‘ancient languages record not epics like the Gilgamesh or the Iliad but rather merchants’ receipts: like five barrels of wine, twenty-two sheepskins, and so on’.Footnote 78 These early receipts and dissection offcuts informed the economic powers of ancient societies and built ‘ritualised numerical operations of ownership in tattooing, excising, incising, carving, or scarifying’.Footnote 79 In these identifiable bodily markings and sacrificial dissections – of humans and animals – lies an early genealogy of computational AI strategies. Threading a course from the history of agriculture to that of isolated computing acts, the ‘dual-use dilemma’ reveals a high-stakes game of scientific research in the accrual of data for prototyping, and in the creation of intellectual property situated in the complex interrelations of AI.Footnote 80

As Blanchette argues, ‘humans too are “tamed” in defining the technical processes of vertical integration, standardization, and monopolization for managing production into global scales for achieving efficiency and profitability’.Footnote 81 This pertains not just to horizontal and vertical divisions of labour, but to computational forms of monitoring across species divides, in which physical bodies are most visible while ‘statistical surveillance’ remains somewhat invisible.Footnote 82

König has called this manifold of military, agricultural and civilian techniques and data sets ‘the algorithmic leviathan’, in that it entwines the pioneering aspirations of 1980s artificial intelligence, from pattern recognition to the use of analogical logics in particular.Footnote 83 Although the nature of the statistical analysis and the computational technologies has changed, it is the infrastructural foundations that allow the tracking, segmenting and partitioning of all bodies. By considering the configurations between agriculture, data and technology in this way, this paper has highlighted the relevance of envirotech frameworks for historians and surveillance scholars with regulatory ambitions for increased transparency in AI systems. This is important amidst the dramatic transitions from an ‘over-the-skin’ to an ‘under-the-skin’ type of surveillance of human beings ushered in by COVID-19 – through infrared scanning, thermal imaging, drone monitoring and proximity detection techniques that stood in for close body contact and served as anticipatory determinations for public security.Footnote 84

As Australian farms increase their scale to maintain profitability – with astonishing land use – autonomous technologies, such as unmanned aerial vehicles, fill gaps in monitoring. A transition to drone ‘shepherds’, robotic harvesters and high-tech surveillance extends to virtual fencing and ‘precision’ planting, enabling statistical data to feed financial decision tools.Footnote 85 While a so-called ‘Fourth Agricultural Revolution’ is consequential to a nation's economic growth and food productivity, the diffusion of data across third parties and digital networks links credentials to sustainability and environmental management but may carry unanticipated consequences.Footnote 86 Farming is no longer family-run: it is big business, it is national security, and it is data-intensive. Here, too, Australian farmers face pressure from industrial organizations to transition to digital techniques. John Deere, for instance, has entered disputes to resist a growing right-to-repair movement, showing that the tractor is not simply an earth-moving machine but a tool for data collection. Attempts to restrict access to intellectual software property and to prevent self-maintenance are ending up in the courts.Footnote 87 Farming has built on post-war aims to become ‘smart’ like cities, and it is now interlaced with logistical tools suited to the Internet of Things and the rise in the mobility of global data.Footnote 88

By tracing how computer vision developed in agriculture for animal biosecurity, this paper has indicated a set of technical and institutional links between AI systems, data, statistics and imaging. Yet these complex chains of experimentation can be vulnerable to bundling such security measures, and their associated infrastructure, together with the detection, surveillance and containment of all manner of phenomena across scientific domains, manufacturing sectors, public spaces and dual-use foundations.Footnote 89 What is critical is how that which is being modelled becomes implicated in securing data that is fungible for a range of ‘AI’ activities, whether in the supply chain of nation-state surveillance or in systems foisted upon Amazon workers.Footnote 90 These activities threaten to grow and may remain unregulated by the kind of checks found in AI bills such as the European Commission's proposed AI Act, which treats real-time biometric identification as a high-risk scenario.Footnote 91 Underneath AI systems exist ‘warped and unclear geometries’ in the data sets and statistical models that can resist domain-specific guardrails, and, as argued elsewhere, intensive agriculture generates problems that cut across regulatory domains.Footnote 92 As a historically significant force behind burgeoning economies and infrastructural innovation, agriculture and the use of animals assist machine-learning models not just to determine ‘what this is’ but to manufacture, assemble and extract elements of nature for maximum yield. Yet as we near the limits of the harvest, the data might also be cannibalized for other uses.

Acknowledgements

I am most grateful to Richard Staley for his wise editorial guidance, and to Matthew Jones, Stephanie Dick and Jonnie Penn for their generous, insightful advice on different versions. I also thank Director Kathryn Henne of RegNet, the School of Regulation and Global Governance at the Australian National University, for her continued support.

References

1 Goldman Sachs global investment research report, ‘Precision farming: cheating Malthus with digital agriculture’, in Equity Research: Profiles in Innovation, Goldman Sachs Group Inc., 2016, pp. 1–42, 9, at www.gspublishing.com/content/research/en/reports/2016/07/13/6e4fa167-c7ad-4faf-81de-bfc6acf6c81f.pdf (accessed 2 April 2022).

2 Gunaikurnai Land and Waters Aboriginal Corporation, at https://gunaikurnai.org/our-story (accessed 1 November 2021); footprints of cattle and soy industries continue to displace indigenous communities. See ‘The anthropologist Eduardo Viveiros de Castro discusses indigenous resistance in the Amazon with indigenous leader Raoni Metuktire, and his pessimism about the climate crisis’, Agência de Jornalismo Investigativo, 10 October 2019, at https://apublica.org/2019/10/werewitnessing-a-final-offensive-against-brazils-indigenous-people (accessed 1 November 2021).

3 John W. Wilesmith, G.A. Wells, Mick P. Cranwell and J.B. Ryan, ‘Bovine spongiform encephalopathy: epidemiological studies’, Veterinary Record (1988) 123(25), pp. 638–44.

4 I tell this story to establish contact with animal lives in technological apparatus. See Etienne S. Benson, ‘Animal writes: historiography, disciplinarity, and the animal trace’, in Making Animal Meaning, East Lansing: Michigan State University Press, 2011, pp. 3–16.

5 On cattle crushes see Ray Ewbank, ‘The behaviour of cattle in crushes’, Veterinary Record (1961) 73, pp. 853–56; for chute and handling see Temple Grandin, ‘The design and construction of facilities for handling cattle’, Livestock Production Science (1997) 49(2), pp. 103–19; and for veterinary procedures see Grandin, ‘Animal handling’, Veterinary Clinics of North America: Food Animal Practice (1987) 3(2), pp. 323–38.

6 Temple Grandin, Thinking in Pictures: My Life with Autism, expanded edn, Knopf Doubleday Publishing Group, 2008, p. 167 – this also represents disability expertise at work; see Cassandra Hartblay, ‘Disability expertise: claiming disability anthropology’, Current Anthropology (2020) 61, pp. S21, S26–S36.

7 H.R. Cross, D.A. Gilliland, P.R. Durland and S. Seideman, ‘Beef carcass evaluation by use of a video image analysis system’, Journal of Animal Science (1983) 57(4), pp. 908–17.

8 A.R. Frost, C.P. Schofield, S.A. Beaulah, T.T. Mottram, J.A. Lines and C.M. Wathes, ‘A review of livestock monitoring and the need for integrated systems’, Computers and Electronics in Agriculture (1997) 17(2), pp. 139–59.

9 Robin Deirdre Tillett, ‘Image analysis for agricultural processes: a review of potential opportunities’, Journal of Agricultural Engineering Research (1991) 50, pp. 247–58.

10 Luisanna Cocco, Roberto Tonelli and Michele Marchesi, ‘Blockchain and self-sovereign identity to support quality in the food supply chain’, Future Internet (2021) 13(301), pp. 1–19.

11 B. Freedman, A. Shpunt, M. Machline and Y. Arieli, ‘Depth mapping using projected patterns’, US Patent 8493496 B2, 2012, date issued: 23 July 2013.

12 Malcolm John McPhee, B.J. Walmsley, Bradley Skinner, B. Littler, J.P. Siddell, Linda Maree Cafe, Hutton Oddy and Alan Alempijevic, ‘Live animal assessments of rump fat and muscle score in Angus cows and steers using 3-dimensional imaging’, Journal of Animal Science (2017) 95(4), pp. 1847–57; also see Rural Industries Research and Development Corporation fact sheet ‘Transformative technologies’, National Rural Issues News, Australian Government, 2016, pp. 1–12, 5 – according to this fact sheet, ‘in combination with technologies along the supply chain, artificial intelligence directs autonomous machines to slice carcases to maximise yield’.

13 James Eyers, ‘Digital currency used in landmark cattle sale for rural economy’, Australian Financial Review, 19 July 2023, at www.afr.com/companies/financial-services/digital-currency-used-in-landmark-cattle-sale-for-rural-economy-20230719-p5dpe (accessed 1 August 2023).

14 Karin Knorr-Cetina and Klaus Amann, ‘Image dissection in natural scientific inquiry’, Science, Technology, & Human Values (1990) 15(3), pp. 259–83, 273.

15 Jean Blancou, ‘A history of the traceability of animals and animal products’, Revue scientifique et technique (International Office of Epizootics) (2001) 20(2), pp. 413–25; for radio tracking wild animals see Etienne Benson, Wired Wilderness: Technologies of Tracking and the Making of Modern Wildlife, Baltimore: Johns Hopkins University Press, 2010.

16 For industrialized slaughter processes see Timothy Pachirat, Every Twelve Seconds: Industrialized Slaughter and the Politics of Sight, New Haven, CT: Yale University Press, 2011; in relation to state power and empire building see Tiago Saraiva, Fascist Pigs: Technoscientific Organisms and the History of Fascism, Cambridge, MA: MIT Press, 2018; for early statistical data gathering in US agriculture see Emmanuel Didier, America by the Numbers: Quantification, Democracy, and the Birth of National Statistics (tr. Priya Vari Sen), Cambridge, MA: MIT Press, 2020.

17 John N. Sofos, ‘Challenges to meat safety in the 21st century’, Meat Science (2008) 78(1), pp. 3–13.

18 Nick Couldry and Ulises A. Mejias, The Costs of Connection: How Data Are Colonizing Human Life and Appropriating It for Capitalism, Oxford: Oxford University Press, 2020, p. 9.

19 Peter Linebaugh, ‘All the Atlantic mountains shook’, Labour/Le Travail (1982) 10, pp. 87–121, 92, added emphasis; for animals see Diego Rossello, ‘Book review: animal labour: a new frontier of interspecies justice?’, Perspectives on Politics (2021) 19(1), pp. 249–51; Henry Buller, ‘Individuation, the mass and farm animals’, Theory, Culture & Society (2013), 30(7–8), pp. 155-75.

20 Department of Primary Industries, NSW Government, Prime Fact Sheet 322, 3rd edn, Todd Andrews, at www.dpi.nsw.gov.au/animals-and-livestock/beef-cattle/appraisal/publications/live-cattle-assessment (accessed 14 October 2021).

21 Dani Valent, ‘Cows in pain, thirsty peaches, stressed tomatoes: how tech's helping nature talk to farmers’, Sydney Morning Herald, 5 June 2021, at www.smh.com.au/national/cows-in-pain-thirsty-peaches-stressed-tomatoes-how-tech-s-helping-nature-talk-to-farmers-20210507-p57pwh.html (accessed 10 August 2021).

22 RIRDC fact sheet, op. cit. (12), pp. 1–12.

23 Jacqueline Bos, Bernice Bovenkerk, Peter Feindt and Ynte van Dam, ‘The quantified animal: precision livestock farming and the ethical implications of objectification’, Food Ethics (2018) 2, pp. 77–92; Simon Coghlan and Christine Parker, ‘Harm to nonhuman animals from AI: a systematic account and framework’, Philosophy and Technology (2023) 36(25), pp. 1–34

24 Edmund Russell, Greyhound Nation: A Coevolutionary History of England, 1200–1900, Cambridge: Cambridge University Press, 2018.

25 Ann Norton Greene, Horses at Work: Harnessing Power in Industrial America, Cambridge, MA: Harvard University Press, 2008; Greene, ‘Technology and the environment in history’, Agricultural History (2021) 95(3), pp. 537–8.

26 Mark Finlay, ‘Far beyond tractors: envirotech and the intersections of technology, agriculture, and the environment’, Technology and Culture, (2010) 51(2), pp. 480–5.

27 Donna Jeanne Haraway, Primate Visions: Gender, Race, and Nature in the World of Modern Science, New York: Routledge, 1989.

28 Etienne Benson, ‘The cattle guard’, in the Multispecies Editing Collective, Transformations in Environment and Society (2017) 1, Troubling Species: Care and Belonging in a Relational World, pp. 49–55, 54.

29 Alex Blanchette, Porkopolis: American Animality, Standardized Life, and the Factory Farm, Durham, NC: Duke University Press, 2020.

30 Coghlan and Parker, op. cit. (23), p. 25; and for quantified objects see Bos et al., op. cit. (23).

31 Herbert A. Simon, The Sciences of the Artificial, reissue of the 3rd edn with a new introduction by John Laird, Cambridge, MA: MIT Press, 2019, p. 3.

32 Louise Amoore and Alexandra Hall, ‘Taking people apart: digitised dissection and the body at the border’, Environment and Planning D: Society and Space (2009) 27(3), pp. 444–64.

33 On institutional funding contracts see Anders Furze and Louisa Lim, ‘“Faustian bargain”: defence fears over Australian university's $100m China partnership’ The Guardian, 19 September 2017, at www.theguardian.com/australia-news/2017/sep/19/faustian-bargain-defence-fears-over-australian-universitys-100m-china-partnership (accessed 10 August 2021).

34 Rod Davis and Scott Janke, ‘Feedlot design and construction: cattle crushes’, Department of Agriculture, Fisheries and Forestry DAFF (2016), at www.mla.com.au/globalassets/mla-corporate/research-and-development/program-areas/feeding-finishing-and-nutrition/feedlot-design-manual/025-cattle-crushes-2016_04_01.pdf.

35 John Forge, ‘A note on the definition of “dual use”’, Science and Engineering Ethics, (2010) 16, pp. 111–18, 111.

36 Kelly Bronson and Phoebe Sengers, ‘Big Tech meets Big Ag: diversifying epistemologies of data and power’, Science as Culture (2022), pp. 1–14, p. 15.

37 Mark Andrejevic and Zala Volčič, ‘“Smart” cameras and the operational enclosure’, Television & New Media (2021) 22(4), pp. 343–59, 345.

38 Amoore and Hall, op. cit. (32), 461.

39 Many were first published in journals run by the Institute of Electrical and Electronics Engineers (IEEE), or from 1985 in the journal Computers and Electronics in Agriculture.

40 Wim Rossing, ‘Animal identification: introduction and history’, Computers and Electronics in Agriculture (1999) 24(1–2), pp. 1–4.

41 P.G.F. Ploegaert, ‘Technical aspects of cow identification in combination with milk yield recording and concentrate feeding in and outside the milking parlour’, Proceedings of the Symposium on Cow Identification System and Their Applications (1976), IMAG, Wageningen, the Netherlands.

42 Rossing, op. cit. (40), p. 3.

43 Alfred R. Koelle, Steven W. Depp and Robert W. Freyman, ‘Short-range radio-telemetry for electronic identification, using modulated RF backscatter’, Proceedings of the IEEE (1975) 63(8), pp. 1260–1.

44 For a short history of RFID applications see Melanie R. Rieback, Bruno Crispo and Andrew S. Tanenbaum, ‘The evolution of RFID security’, IEEE Pervasive Computing (2006) 5(1), pp. 62–9.

45 Frost et al., op. cit. (8).

46 Kees-Jan van Dorp, ‘Beef labelling: the emergence of transparency’, Supply Chain Management: An International Journal (2003) 8(1), pp. 32–40.

47 J.F. Rockart, ‘Chief executives define their own data needs’, Harvard Business Review (1979) 57, pp. 81–3.

48 Pieter Hogewerf, ‘Current tools and technologies for the identification and traceability’, Wageningen UR Livestock Research (2011), at icar.org/wp-content/uploads/2015/12/Hogewerf.pdf.

49 Rossing, op. cit. (40), p. 3.

50 Michael Neary and Ann Yager, ‘Methods of livestock identification (No. AS-556-W)’, West Lafayette: Purdue University (2002), pp. 1–9, at www.extension.purdue.edu/extmedia/AS/AS-556-W.pdf (accessed 8 April 2022).

51 See William Earl Petersen, ‘The identification of the bovine by means of nose-prints’, Journal of Dairy Science (1922) 5(3), pp. 249–58; Ali Ismail Awad, ‘From classical methods to animal biometrics: a review on cattle identification and tracking’, Computers and Electronics in Agriculture (2016) 123, pp. 423–35. For animal facial recognition biometrics see Yue Lu, Xiaofu He, Ying Wen and Patrick Wang, ‘A new cow identification system based on iris analysis and recognition’, International Journal of Biometrics (2014) 6(1), pp. 18–32; and Rony Geers, ‘Electronic monitoring of farm animals: a review of research and development requirements and expected benefits’, Computers and Electronics in Agriculture (1994) 10(1), pp. 1–9.

52 Rossing, op. cit. (40).

53 Jonathan H. Connell and Michael Brady, ‘Learning shape descriptions’, Proceedings of the 9th International Joint Conference on Artificial Intelligence (1985) 2, pp. 922–5; for a 1987 MIT report on learning shape models from a version of Winston's ANALOGY program see Connell and Brady, ‘Generating and generalizing models of visual objects’, Artificial Intelligence (1987) 31(2), pp. 159–83; Linda G. Shapiro, ‘A structural model of shape’, IEEE Transactions on Pattern Analysis and Machine Intelligence (1980) 2, pp. 111–26.

54 J.A. Marchant and C.P. Schofield, ‘Extending the snake image processing algorithm for outlining pigs in scenes’, Computers and Electronics in Agriculture (1993) 8, pp. 261–75.

55 Frost et al., op. cit. (8), p. 150.

56 This instrumental need was identified in 1978, when the US General Accounting Office reported to the US Congress that the United States Department of Agriculture (USDA) must ‘increase research efforts to develop instruments to accurately measure beef carcass characteristics’. See Dale R. Woerner and Keith E. Belk, ‘The history of instrument assessment of beef’, prepared for the National Cattlemen's Beef Association (2008), pp. 1–18, 1, at beefresearch.org/Media/BeefResearch/Docs/the_history_of_instrument_assessment_of_beef_08-20-2020-93.pdf (accessed 28 April 2022).

57 H.R. Cross, D.A. Gilliland, P.R. Durland and S. Seideman, ‘Beef carcass evaluation by use of a video image analysis system’, Journal of Animal Science (1983) 57(4), pp. 908–17; and G. Purnell and K. Khodabandehloo, ‘Vision for robot guidance in automated butchery’, in Spyros G. Tzafestas (ed.), Robotic Systems: Advanced Techniques and Applications, Dordrecht: Springer Netherlands, 1992, pp. 619–26.

58 Geers, op. cit. (51).

59 Geoffrey Bowker and Susan Leigh Star, Sorting Things Out: Classification and Its Consequences. Inside Technology, Cambridge, MA: MIT Press, 1999, ‘To classify is human’, p. 36.

60 Microsoft calls this the Kinect effect, as on 1 February 2012 they released the Kinect Software Development Kit (SDK) that enabled customization between humans and machines with three-dimensional depth sense programmability. See Zhengyou Zhang, ‘Microsoft Kinect sensor and its effect,’ IEEE MultiMedia (2012) 19(2), pp. 4–10.

61 Michael Castelle, ‘Deep learning as epistemic ensemble’, castelle.org, 2018.

62 Todd Andrews, Department of Primary Industries, NSW Government, Prime Fact Sheet 322, 3rd edn, at https://www.dpi.nsw.gov.au/animals-and-livestock/beef-cattle/appraisal/publications/live-cattle-assessment (accessed 14 October 2021).

63 Lev Manovich, ‘What is visualisation?’, Visual Studies (2011) 26(1), pp. 36–49, 36.

64 James Elkins, ‘Art history and images that are not art’, Art Bulletin (1995) 77(4), pp. 553–71, 557.

65 ‘The future of farming with cow facial recognition’, IDTechEx, 6 February 2018, at www.onartificialintelligence.com/articles/13706/the-future-of-farming-with-cow-facial-recognition (accessed 14 August 2019).

66 Herman Raadsma, Ian Harris, Dacheng Tao, Mehar Khatkar, Junbin Gao, Sarah Thompson, Will Gibson and Mark Ferguson, ‘Artificial intelligence in wool production’, Final Report for Australian Wool Innovation Limited, 2019, pp. 1–44.

67 McPhee et al., op. cit. (12).

68 McPhee et al., op. cit. (12), p. 1856.

69 From nineteenth-century Helmholtz experiments on biological quantification and graphical statistic curves investigated by Frederic L. Holmes and Kathryn M. Olesko, ‘The images of precision: Helmholtz and the graphical method in physiology’, in M. Norton Wise (ed.), The Values of Precision, Princeton, NJ: Princeton University Press, 1995, pp. 198–221.

70 C.C. Gordon, B. Bradtmiller, C.E. Clauser, T. Churchill, J.T. McConville, I. Tebbetts and R.A. Walker, technical report (TR-89-044), ‘1988 anthropometric survey of U.S. Army personnel: methods and summary statistics’, US Army Natick Research, Development, and Engineering Center, MA, 1989.

71 Ted D. Churchill, B. Bradtmiller and Claire C. Gordon, ‘Computer software used in US Army anthropometric survey 1987–1988’, Anthropology Research Project Inc. (1988), Yellow Springs, Ohio, at https://apps.dtic.mil/sti/pdfs/ADA201185.pdf (accessed 13 April 2022).

72 Although over two hundred dimensions were measured on 9,000 ethnically diverse soldiers, the survey data lacked the range of variability found in the wider population, such as people with disabilities; see www.corada.com/documents/anthropometry-for-persons-with-disabilities/whole-document (for the quote). Also see the problems of ‘shape-to-value’ assessments raised in race and disability studies: Simone Browne, ‘Digital epidermalization: race, identity and biometrics’, Critical Sociology (2010) 36(1), pp. 131–50, 134.

73 Nathan Kirchner, Alen Alempijevic and Alexander Virgona, ‘Head-to-shoulder signature for person recognition’, IEEE International Conference on Robotics and Automation, 2012, pp. 1226–31; Nathan Kirchner, Alen Alempijevic, Alexander Virgona, Xiaohe Dai, Paul G. Plöger and Ravi K. Venkat, ‘A robust people detection, tracking, and counting system’, Australasian Conference on Robotics and Automation, ACRA, 2014, pp. 1–8.

74 Ellipsoids are visible in the video materials provided in the RM-CRC Project 3.1.2 conducted by Alexander Virgona, Alen Alempijevic and T. Vidal-Calleja, ‘Socially constrained tracking in crowded environments using shoulder pose estimates’, 2018 IEEE International Conference on Robotics and Automation (ICRA), 2018, pp. 4555–62. Supported by the Rail Manufacturing CRC (RMCRC) and Downer EDI Rail Pty Ltd.

75 On 1 May 2019, a media enquiry from the Financial Times raised concerns about UTS's relationship with the China Electronics Technology Corporation (CETC) based on an embargoed Human Rights Watch (HRW) report titled ‘China's algorithms of repression: reverse engineering a Xinjiang police mass surveillance app’, at www.uts.edu.au/news/media-contacts/uts-cetc-review (accessed 26 March 2022).

76 Vicky Xiuzhong Xu, Danielle Cave, James Leibold, Kelsey Munro and Nathan Ruser, ‘Uyghurs for sale: “re-education”, forced labour and surveillance beyond Xinjiang’, ASPI Australian Strategic Policy Institute, February 2020, at www.aspi.org.au/report/uyghurs-sale; and Australian Government, Federal Register of Legislation, Modern Slavery Act 2018, at www.legislation.gov.au/Details/C2018A00153 (accessed 1 August 2023).

77 The use of animals implicit in the creation of intellectual property and patentable biotechnology remains unimpeded by the lack of legal rights afforded to animals. See L. Cressida, ‘Inventing animals’, in Y. Otomo and E. Mussawir (eds.), Law and the Question of the Animal: A Critical Jurisprudence, Durham, NC: Duke University Press, 2013, pp. 1–208, 56.

78 Lorraine Daston, ‘Calculation and the division of labor, 1750–1950’, Bulletin of the German Historical Institute (2018) 62, pp. 9–30, 9.

79 Gilles Deleuze and Félix Guattari, A Thousand Plateaus: Capitalism and Schizophrenia, vol. 1, New York: Viking Press, 1977, p. 144.

80 See Lilly Irani, ‘The cultural work of microwork’, New Media & Society (2015) 17(5), pp. 720–39, 724; and Mark Graham, ‘The rise of the planetary labour market – and what it means for the future of work’, Technosphere Magazine, 2018, Trust Dossier, at www.anthropocene-curriculum.org/contribution/the-rise-of-the-planetary-labor-market-and-what-it-means-for-the-future-of-work.

81 ‘Thinking through human–animal labor: an interview with Alex Blanchette by Mariko Yoshida’, in More-Than-Human (2020) 4, at https://ekrits.jp/en/2020/10/3879 (accessed 1 November 2021).

82 ‘Panopticons and leviathans: Oscar H. Gandy, Jr. on algorithmic life (a conversation about life and death in a computerized world)’, Logic Magazine (2020) 12, Commons, at https://logicmag.io/commons/panopticons-and-leviathans-oscar-h-gandy-jr-on-algorithmic-life (accessed 24 March 2022).

83 Pascal D. König, ‘Dissecting the algorithmic leviathan: on the socio-political anatomy of algorithmic governance’, Philosophy & Technology (2020) 33(3), pp. 467–85.

84 Karamjit S. Gill, ‘Prediction paradigm: the human price of instrumentalism’, AI & Society (2020) 35(3), pp. 509–17; for health security see Stephen L. Roberts and Stefan Elbe, ‘Catching the flu: syndromic surveillance, algorithmic governmentality and global health security’, Security Dialogue (2017) 48(1), pp. 46–62; for drones see Lisa Parks, ‘Drones, infrared imagery, and body heat’, International Journal of Communication (2014) 8, pp. 2518–21; for surveillance see the US Department of Health and Human Services Food and Drug Administration, Center for Devices and Radiological Health (CDRH) and Office of Product Evaluation and Quality (OPEQ), ‘Enforcement policy for tele-thermographic systems during the coronavirus disease 2019 (COVID-19) public health emergency’, at www.fda.gov/media/137079/download.

85 Kate Yaxley, Keith Joiner and Hussein Abbass, ‘Drone approach parameters leading to lower stress sheep flocking and movement: sky shepherding’, Scientific Reports (2021) 11, 7803.

86 Hannah Barrett and David Christian Rose, ‘Perceptions of the fourth agricultural revolution: what's in, what's out, and what consequences are anticipated?’, Sociologia Ruralis (2022) 62(2), pp. 162–89.

87 The Right to Repair Movement is a coalition of actors seeking legal and regulatory protection for farmers' ability to modify and repair the software in their equipment, against companies that restrict repairs to high-cost authorized services. See Laura Sydell, ‘DIY tractor repair runs afoul of copyright law’, NPR, 17 August 2015, at www.npr.org/sections/alltechconsidered/2015/08/17/432601480/diy-tractor-repair-runsafoul-of-copyright-law (accessed 1 July 2018).

88 David Christian Rose and Jason Chilvers, ‘Agriculture 4.0: broadening responsible innovation in an era of smart farming’, Frontiers in Sustainable Food Systems (2018) 2(87), pp. 1–7.

89 With thanks to Richard Staley. See Lesley Sharp, ‘Perils before swine’, in Nancy Chen and Lesley Sharp (eds.), Bioinsecurity and Vulnerability, Santa Fe: School for Advanced Research Press, 2014, pp. 45–64, 61.

90 See Sam Adler-Bell, ‘Surviving Amazon’, Logic (2019) 8, at https://logicmag.io/bodies/surviving-amazon (accessed 10 August 2021).

91 European Commission, ‘Proposal for a regulation of the European Parliament and of the Council – laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union Legislative Acts’, AI Act Website (2021) (accessed 17 April 2023).

92 Ramon Amaro, ‘Artificial intelligence: warped, colorful forms and their unclear geometries’, in Danae Io and Callum Copley (eds.), Schemas of Uncertainty: Soothsayers and Soft AI, Amsterdam: PUB/Sandberg Instituut, 2019, pp. 69–90; and Christine Parker, Fiona Haines and Laura Boehm, ‘The promise of ecological regulation: the case of intensive meat’, Jurimetrics (2018) 59, pp. 1–15.

Figure 1. A layout for corralling livestock in Australia and South America, including a squeeze or ‘cattle crush’ as part of the electronic sorting and computerized systems. Reprinted from Temple Grandin, ‘The design and construction of facilities for handling cattle’, Livestock Production Science (1997) 49(2), pp. 103–19, 112, with permission from Elsevier.