In April 2022, a team of journalists at the tech website 9to5Mac discovered that the Apple iPhone Operating System (iOS) 15.5 beta update was blocking photographs taken at sites related to the Holocaust from appearing in the Memories feature of the Photos app. By adjusting the automated system to exclude photographs tagged with geographic coordinates from 12 designated ‘sensitive locations’, this measure aimed to reduce the likelihood of ‘unwanted memories’ appearing in the collections and sequences that the feature curates from users’ personal photo libraries (Espósito 2022). Apple has been using algorithmically driven artificial intelligence (AI) to scan users’ personal photographs for the automated creation of photo albums and slideshows since the Memories feature launched in 2016, but the decision to omit content related to this specific history reignites major concerns surrounding the representation and memory of the Holocaust in the digital era. This article examines Apple's decision to block ‘sensitive locations’ from the Memories feature as a significant development in the transition from digital to algorithmic memory and as a case study for tracing the growing public conception of automated memory technologies.
The initial list of ‘sensitive locations’ consisted of 12 sites, all notable for their connection to the Holocaust: the Yad Vashem Memorial, the Dachau Concentration Camp, the US Holocaust Museum, the Majdanek Concentration Camp, the Berlin Holocaust Memorial, the Schindler Factory, the Belzec Extermination Camp, the Anne Frank House, the Sobibor Extermination Camp, the Treblinka Extermination Camp, the Chelmno-Kulmhof Extermination Camp, and the Auschwitz-Birkenau Concentration Camp (Espósito 2022). The implementation of this measure raises a distinctly contemporary set of concerns regarding the remembrance of the Holocaust in the era of the algorithm while calling public attention to the potential stakes of automated technologies like Apple Memories in framing representations of the past. Automated memory technologies now exert a tremendous influence on how users interact with personal photographs and histories, but they have often drawn criticism for how they conceive of and perform the processes of remembering (Fontcuberta 2015; Lee 2020; Van Dijck 2007). By examining the response to the ‘sensitive locations’ block, this article addresses the following questions: How does Apple's decision to block ‘sensitive locations’ from the Memories feature impact the contemporary discourse surrounding the digital and algorithmic remembrance of the Holocaust? And what does the public response to this decision reveal about the growing public conception of automated memory technologies, their cultural influence, and their technical operation? This analysis identifies three key areas of public concern: the appropriate boundaries of platform intervention into individual and collective memories, the subjective value and function of personal photographs, and the questionable metrics through which algorithmic technologies target their outcomes. Though opinion in all of these areas is divided, the response to the ‘sensitive locations’ block provides insight into the rising public consciousness of automated memory technologies and their significance in the evolving personal and collective memory practices of the 21st century.
Between memory and history
To situate the ‘sensitive locations’ block within the evolving discourse surrounding the digital memory of the Holocaust, this research draws from literature on the relationship between memory, place, and mobile media as well as the complex boundaries between individual and collective memory. As memories often become attached to certain places, this intertwined relationship has increasingly factored into the design of many mobile media platforms and services which request or even require locational information for their functionality (Özkul and Humphreys 2015). From a memory-making perspective, the significance of a given place can range from the entirely personal to the nearly universal and everywhere in between. Some places are widely recognised for the significant events that occurred on (or in) them, whereas others, such as archives, museums, or monuments, are deliberately created to house memories (Nora 1989). Each of these types of places is represented in Apple's list of ‘sensitive locations’, with some sites notable for their direct connection to the Holocaust, eg, the Auschwitz-Birkenau Concentration Camp, and others for the designated purpose of Holocaust remembrance, eg, the Berlin Holocaust Memorial (also known as the Memorial to the Murdered Jews of Europe). Regardless of the processes through which each of these 12 sites gained their significance, they have all become widely recognised as important locations for commemorating and preserving the collective memory of the Holocaust. But with an estimated 1.46 billion active iPhone users worldwide (Shewale 2023), it is difficult to imagine what other places might one day be deemed worthy of blocking from the automated memories of a global userbase so vast it can hardly be spoken of as a collective.
The title of this article refers to historian Pierre Nora's widely influential essay Between Memory and History: Les Lieux de Mémoire (1989), which laments the ‘collapse’ of memory as signalled by increased mediatisation and the proliferation of mass culture. Nora (1989) draws a distinction between ‘real’ memory and history, arguing that ‘real’ memory is a perpetually actual phenomenon: ‘it remains in permanent evolution, open to the dialectic of remembering and forgetting, unconscious of its successive deformations, vulnerable to manipulation and appropriation, susceptible to being long dormant and periodically revived’ (p. 8). Conversely, history is described as a representation of the past that ‘belongs to everyone and to no one’, a problematic and incomplete reconstruction dependent on traces, media, and critical distance (Nora 1989, pp. 8–9). Nora (1989) goes on to argue that in our ‘hopelessly forgetful’ modern society, memory more closely resembles history, and is merely a way to organise the past through ‘sifted and sorted’ traces (p. 8). If our memory today consists largely of these media traces, it is important to consider how popular technologies like Apple Memories approach the acts of sifting and sorting and what the potential consequences of their use may be.
The proliferation of mobile devices has contributed to a dramatic increase in the creation of personal media and has transformed how people use media to interact with important histories and collective memories. These changes have also had an impact on the remembrance of the Holocaust (Shandler 2017; Walden 2019), and the reshaping of these practices may be accelerating as the last of the remaining witnesses pass away (Feldman and Musih 2023). These transformations can be understood as part of the ‘connective turn’ in media and memory studies (Hoskins 2011), which refers to ‘the emergent set of tensions and transitions from a “scarcity” to a “post-scarcity” culture availed through the abundance, pervasiveness and accessibility of communication networks, nodes, and digital media content’ (p. 20). The relationship between individual and collective memory has been variously reconceptualised following the connective turn, leading to a renewed interest in the expanded range of environments where the two converge (Hoskins 2016). As one such environment, Apple Memories sits at the junction between the individual and the collective, and the stakes of its presence there have been amplified by the gravity of the history addressed by the ‘sensitive locations’ update.
Collective memory is a concept advanced by Maurice Halbwachs in Les Cadres Sociaux de la Mémoire (Halbwachs 1992) to refer to the shared understandings, narratives, and values of a social group. In order to interrogate how the public and/or official narratives of the past affect the memory of the individual, and vice versa, Cordonnier et al. (2022) have proposed the metaphor of the hourglass. The invertible hourglass sifts between the broad historical, political, and social contexts associated with collective memory on a macro scale and the more private or intimate contexts of individual memory which occur on a micro scale. The narrow passage between the macro and the micro spheres is described as a ‘meso’ scale which filters between the two. Following Halbwachs, Cordonnier et al. (2022) position the small social group of the family at this crucial junction, but in acting as a filter for sifting and sorting personal photographs (or omitting aspects of a specific history), one can see how technologies like Apple Memories can fulfil a similar function. While family and other close social groups still contribute to the reconstruction of individual memories, the formation of personal histories also increasingly occurs through interactions with personal media framed through automated technologies.
Nora's distinction between memory and history has only grown more complicated as many practices of remembrance have evolved alongside the rapid advancements in media technology. French philosopher Bernard Stiegler (2009, 2011) has referred to these technologies of remembering as mnemotechnologies and describes their rise as contributing to the industrialisation of memory. The widespread use of such technologies has also contributed to what Nora (1989) describes as the acceleration of history, in which an increased dependence on documentation signals a decline in human memory. As an automated system for sorting through the unprecedented abundance of personal photographs many users now accumulate, the Memories feature is emblematic of the significant changes occurring in mnemotechnologies as part of the ongoing transition from digital memory to algorithmic memory. This transition is marked by the growing need to organise an expansive volume of digital content about the past and the delegation of this task to algorithmically powered information retrieval systems (Makhortykh 2021). Observing how this transition impacts the memory of mass atrocities, Makhortykh (2021) suggests, ‘commercial retrieval systems usually do not consider their role in shaping memory, resulting in gaps between their intended use and actual functionality as a mnemotechnology’ (p. 181). Unlike YouTube's recommendation system or Google's search results, however, Apple Memories is explicitly concerned with shaping the memories of its users, which leads to its unique position as an intermediary between the memories of the individual on one hand and the collective on the other.
On memories, algorithms, and imaginaries
Apple Memories and its competitors approach the issues surrounding photographic abundance and organisation by automatically curating content from users’ photo libraries. By associating and re-associating images from different places and times to generate new sequences and juxtapositions, these systems can create unlikely connections and even new meanings (Van House 2011, p. 129). Finn (2017) notes that the increased storage capacity and network capabilities of modern technology seemingly allow users to remember more than ever before, but this possibility ‘alters our capacity to perform certain kinds of biological recollection by encouraging us to focus on how we might access information rather than the information itself’ (p. 37). As algorithms now play an elevated role in shaping how users interact with their photographs, their capacity to influence the formation of memories will likely impact the processes through which personal narratives, histories, and identities are formed. Jacobsen (2022) has described this process as facilitating ‘algorithmic emplotment’, which ‘contributes to our understanding of the social power of algorithms as it helps to scrutinise the way in which people's lives are rendered sequential, ordered, and ultimately meaningful and actionable by algorithmic processes’ (p. 1083). Yet even as algorithms increasingly factor into the content users encounter in the Photos app, how Apple Memories is designed to interpret and act upon photographs remains largely mysterious to many of its users.
When Memories was introduced at the Worldwide Developers Conference (WWDC) in 2016, Apple described a technology that would perform ‘11 billion computations per photo’ and which ‘automatically creates curated collections of your most meaningful photos and videos’ (Federighi 2016). As advanced computer vision and machine learning were presented as the means through which these lofty claims could be achieved, the agency of the user in determining what content to resurface from their photo library was reimagined through the logic of the algorithm. The significance of this shift cannot be overstated as it dramatically reconfigures the premise upon which users navigate their photographs and come to form an understanding of their pasts. With the Apple Memories algorithms at its core, the Photos app library was no longer to be fixed in a linear sequence but rather envisioned as a dynamic archive to be operationalised through relational data practices. This heralded a dramatic shift for Apple, away from technologies aimed at helping users better organise their own photographs and towards algorithmic technologies that could automate these processes (Pereira 2022). Noting how certain memory practices are not easily translated to digital platforms, Van Dijck (2007) suggests that, ‘Software engineers increasingly begin to realise that the design of picture management systems requires a profound understanding of why and how users interact with their pictures’ (p. 110). The prospect of entrusting algorithmic systems with judgements traditionally reserved for humans is particularly concerning when it comes to the representation of histories as significant as the Holocaust.
Apple Memories’ reimagining of the archive is effectively understood through Mackenzie and Munster's (2019) concept of ‘platform seeing’, which refers to the ways in which images are increasingly quantified, labelled, formatted, and otherwise made ‘platform-ready’ in order to be subsequently understood as part of larger image ensembles. Such image ensembles transcend the comparably limited perceptual scope of individuals, and their operativity is distributed across a variety of devices, human agents, algorithms, and artificial intelligence (Mackenzie and Munster 2019, p. 5). In systems such as these, photographs are not simply there to be seen by human observers, but are automatically categorised, organised, operationalised, and even assigned a respective value in relation to other photos in the user's library. These systems may not always succeed in curating the exact content their users desire, but the relative convenience of automation may allay many of these concerns.
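To make the notion of a ‘platform-ready’ image more concrete, the sketch below shows one way a photograph can be reduced to a comparable, rankable record. It is a minimal illustration written in Swift under stated assumptions rather than a description of Apple's actual data model: the PhotoRecord type, its fields, and the rankForMemory function are hypothetical stand-ins for the kinds of labels and scores such systems are thought to compute.

```swift
import Foundation

// A minimal, hypothetical sketch of a 'platform-ready' photo record: the image is
// reduced to metadata, machine-generated labels, and scores that make it comparable
// to every other photo in the library. All field and function names are illustrative
// assumptions, not Apple's actual data model.
struct PhotoRecord {
    let id: UUID
    let capturedAt: Date
    let latitude: Double?           // geotag, if present in the EXIF data
    let longitude: Double?
    let sceneLabels: [String]       // e.g. ["beach", "sunset", "group photo"]
    let aestheticScore: Double      // 0.0-1.0, assigned by some scoring model
    let personIdentifiers: [String] // on-device face clusters
}

// Ranking photos for a hypothetical 'memory': each image's value is relative,
// computed against the rest of the ensemble rather than judged on its own terms.
func rankForMemory(_ library: [PhotoRecord], matching label: String) -> [PhotoRecord] {
    library
        .filter { $0.sceneLabels.contains(label) }
        .sorted { $0.aestheticScore > $1.aestheticScore }
}
```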
The abstraction of memory into calculable metrics is a crucial component of automated memory making systems, but the quantification of such ethereal processes can restrict the free circulation of memory in individual and collective experience (Jacobsen and Beer 2021). Attempting to mimic the affective qualities of memory through forms of metricisation has been referred to by Jacobsen and Beer (2021) as ‘quantified nostalgia’, which reimagines acts of remembrance through the emerging logics and capabilities of algorithmic media technologies. Though there is scepticism surrounding the merits of such quantification, algorithms often possess a somewhat seductive quality that leads users to accept their presence in cultural spaces, even when they produce results that do not quite fit expectations (Finn 2017, pp. 49–50). It is therefore often only when the results fall well outside of or even betray these expectations that more serious scrutiny is given to the presence of algorithms in aspects of daily life. Moments of heightened algorithmic awareness are significant insomuch as they either shape or threaten the collective cultural imaginary that surrounds these mysterious systems.
In observing the ways knowledge of the self is increasingly produced through a distributed network of mass-produced autonomous devices, Hong (2020) has pointed to the imperative responsibility placed upon users to understand how these technologies transform aspects of human agency. When the futuristic fantasies and imaginaries of human augmentation dominate much of the contemporary technological discourse, imperfect machines are often given a credibility which vastly exceeds their present capabilities (Hong 2020). This powerful grip on the public imagination helps many technologies endure moments of rupture – ‘Even as they malfunction and disappoint, they help drag impossible functions and nonexistent relations into the realm of the sayable and thinkable’ (Hong 2020, p. 14). Understanding the imaginaries that are built around Apple Memories provides a useful foundation upon which to critically interrogate how the technology is meant to be conceived by its userbase and how this may differ from its actual perception.
The algorithmic imaginary (Bucher 2017) and the sociotechnical imaginary (Jasanoff 2015) provide conceptual frameworks through which users’ perception of Apple Memories can be better understood. The ‘algorithmic imaginary’ is a notion developed by Bucher (2017) to describe ‘the way in which people imagine, perceive and experience algorithms and what these imaginations make possible’ (p. 31). Whereas Bucher (2017) is focused primarily on personal stories and affective encounters with algorithms, the collective cultural perception of algorithms as grown through media coverage and commentary is also a significant factor in shaping the imaginary that surrounds algorithmic technologies. This discourse can also be viewed through Jasanoff's (2015) notion of the ‘sociotechnical imaginary’, a term that refers to the collective belief that advancements in science and technology will deliver a desirable future based on a shared understanding of social values (p. 4). When looking specifically at Apple Memories and the significance of the ‘sensitive locations’ block, it is clear that the larger portion of the userbase will be made aware of this update through the media coverage it receives rather than through their own unique experience of it. As such, both the algorithmic and the sociotechnical imaginary are impacted by this development – users may begin to question the consequences of algorithms in structuring and selecting their own photos and memories as well as how their personal experience may connect to larger concerns regarding how new technologies will shape society and culture moving forward.
Methodology
When Apple released the iOS 15.5 beta 3 update to developers and public beta testers in April 2022, the analysts at 9to5Mac were the first to report on the ‘sensitive locations’ block. 9to5Mac is a website dedicated to breaking news and information about all things Apple and has been cited by publications including the New York Times, the Washington Post, the Wall Street Journal, the Financial Times, and Bloomberg (9to5Mac 2023). As of October 2023, 9to5Mac averages between 4.3 and 5.3 million monthly visitors, with the largest age group demographic (33.94%) between the ages of 25 and 34 (Similarweb 2023). In a brief post by tech journalist Filipe Espósito on April 26th 2022, news of Apple's decision to block specific locations from the Memories feature in the Photos app was revealed to this readership. It was quickly picked up by a range of similar tech websites and eventually by a few more mainstream media outlets, including Forbes (Monckton 2022) and The Sun (Edwards 2022). Websites like 9to5Mac are popular within the tech community and act as a crucial intermediary between the official communications of tech companies and the general public. They often distil the important aspects of product developments and software updates into accessible reportage, appealing to readers with varying degrees of technical knowledge. Though this coverage did not find tremendous purchase in the public discourse surrounding algorithms or collective memory, it is significant insomuch as it represents a rare moment of transparency into how the Apple Memories algorithm acts upon users’ photographs and how this technology might impact the remembrance of important histories. Though far from exhaustive, the comment section of the original post offers insight into how the ‘sensitive locations’ update was received by an engaged segment of the userbase. It is a unique site of exchange that illustrates a range of concerns, questions, observations, and opinions emerging in public discourse.
Because Apple made no announcement concerning the introduction of ‘sensitive locations’ to the Photos app, it is only thanks to the tech-savvy journalists at 9to5Mac that this update was brought to public attention. This was achieved through analysis of the changes to the code in the Memories feature, which requires developer access and a strong degree of familiarity with the inner workings of iOS. Though this rare combination of technical knowledge and investigative journalism should be lauded at a time when the gaps between computer scientists, social scientists, and the general public remain vast (Seaver 2017), it also underscores the importance of journalists for identifying technological developments and rendering them in language that is comprehensible to a more general public. In order to evaluate how the media reportage and comment section covering the ‘sensitive locations’ block contribute to the rising public consciousness of automated memory technologies, this article employs critical discourse analysis as a method for examining how this news was received by this particular segment of the iPhone userbase and what this reception suggests about the perception of the Memories technology as a whole. To effectively map these layers of meaning, this analysis draws from Fairclough's (1995) three-dimensional analytical framework, examining the written text and comment section as well as the practices of production, distribution, and consumption in which they emerged, and the sociocultural practices contributing to the ‘sensitive locations’ block as a discursive event.
As of October 2023, 61 comments appear under the 9to5Mac article which originally reported this news – while many commenters used the opportunity to critically reflect on the potential implications of this development, many others engaged in argumentation and jokes. Because comments for this thread have now been disabled, the comment section represents a preserved encapsulation of the immediate reactions, opinions, and exchanges of this particularly active subset of the iPhone userbase, effectively capturing a unique moment in the rising public consciousness of automated memory technologies and their impact on the memories of their users. The data used for this analysis was posted on a public forum where commenters are able to use pseudonyms appropriate to their desired degree of publicity. All usernames appear in this article as they do in the publicly accessible comment section and include no additional personal identifiers.
The following sections look at three interrelated areas of concern that emerged across the comment section of the original 9to5Mac post and situate them within the broader discourse concerning the commemoration of the Holocaust and the impact of algorithmic technologies on memory. The first section examines the unique aspects of the ‘sensitive locations’ block in the evolving context of Holocaust remembrance with an emphasis on the increasingly blurry boundaries between individual and collective memory and the impact of the transition from digital to algorithmic memory. The second section broadens the discussion surrounding potentially sensitive content and the subjective cultural processes by which personal memories become attached to photographs, underscoring the variations and nuances that prove difficult for algorithmic processing. The third section goes on to problematise the quantification of memory for algorithmic systems, troubling the notion that there is an appropriate metric for curating memories across the diverse sensibilities of the Apple Memories userbase.
The digital and the algorithmic memory of the Holocaust
As long as this is able to be toggled - I don't want someone else deciding what is ‘sensitive’ … and perhaps I WANT a memory of a particularly moving place to keep me grounded. They don't all have to be ‘happy’ or ‘celebratory’ memories. Let's not sanitise the world and forget our past or we risk repeating our mistakes. Maybe use the location to select more appropriate, moving music for the memory with this feature … or alternatively, put them into their own category entirely, slightly hidden, rather than shoving it on the Home Screen. Not all memories are happy, nor should we expect them to be. That's an unhealthy way to live. -Paul Martin
Couldn't agree more. I might add you should be able to put the memories in the category you suggest, slightly hidden, yourself. Might be a loved one who passed away, or a relationship no longer standing (but maybe you had pics with kids you might want to keep). But let people be in charge of their memories. I, for one, have visited Dachau and have shot pictures. I love them, for the power they have, the message they convey (never again) and it doesn't bother me the slightest when they appear. Quite the opposite, they serve as a powerful reminder. -il_teo77 (2022)
As the exchange above demonstrates, many commenters on the post were compelled not only to express their opinions but also to contextually situate and elaborate upon both the historical and personal implications of this development. Paul Martin's commentary in particular reveals a surprising degree of contemplation about how lives and memories ought to be experienced and the pitfalls of technologies that disregard or misrecognise the importance of user agency in these practices. Another commenter on the original 9to5Mac post was quick to note that the update was released the day before Yom HaShoah, the Holocaust Remembrance Day in Israel, which is held on the 27th of Nisan in the Hebrew calendar (April or May in the Gregorian calendar) (5723alex 2022). One can only speculate as to Apple's awareness of this important day of remembrance as the company made no official announcement about the update, but such timing would be quite a coincidence. This significant aspect of the story is absent in the original 9to5Mac reportage, which refers to the block as an ‘interesting tweak’, before adding that ‘Interestingly, all the places banned in this version are related to the Holocaust’ (Espósito 2022). Though the decision to only block locations specifically connected to the Holocaust is certainly interesting, these neutral descriptions circumvent the potentially controversial or contentious angles of this overtly sociopolitical move.
As Feldman and Musih (2023) note, the remembrance of the Holocaust in public discourse often reflects changes in the evolving social, political, and linguistic factors that surround this history – all of which are highly interwoven with the advancement of the media technologies through which various representations of the Holocaust occur. Moreover, such changes are not evenly distributed across cultures, borders, generations, or countless other distinctions that may affect how one relates to and understands the history of the Holocaust. The continued interest in the intergenerational memory of the Holocaust and the ways it is represented often invokes the wisdom of Santayana's (1980) famous aphorism, ‘Those who cannot remember the past are condemned to repeat it’ (p. 172) – a logic which actually lies in opposition to Apple's decision. This sentiment is echoed in Martin's (2022) comment, ‘Let's not sanitise the world and forget our past or we risk repeating our mistakes’. Though Apple's measure does not amount to outright censorship as one commenter goes so far as to suggest (Maxis 2022), it calls attention to the difficulties many platforms and policy makers continue to face regarding the evolving representation of this important history in digital and algorithmic technologies.
Questions concerning how to appropriately represent the history of the Holocaust have been the subject of many debates and studies concerning collective memory, particularly following the connective turn (Hirsch 2012). In 2000, a French court ordered Yahoo! to block French web users from accessing auctions of Nazi memorabilia hosted on its website (Hearst 2000). This landmark ruling accused the web provider of trivialising the Holocaust, with a judge proclaiming that Yahoo! had offended the nation's collective memory (Hearst 2000). It was also among the first governance decisions that gave rise to divergent Internet experiences for people based on geolocation. These tensions have persisted and intensified with the growth of social media and the intergenerational and cross-cultural interactions it enables. It was not until 2020 that Facebook and Twitter announced that Holocaust denial would be barred from their social media platforms (Shead 2020). Contentious material from locations related to the Holocaust on social media has also drawn mainstream media attention, often showcasing a polarisation of opinion. Some examples include the Daily Mail covering a photo blog that collected profile pictures from the Grindr dating app taken at the Memorial to the Murdered Jews of Europe (Stebner 2013) or the New Yorker covering a Facebook page that collected pictures of teenagers in questionable poses at the Auschwitz-Birkenau Concentration Camp (Margalit 2014). Even the highly specific and relatively recent phenomenon of selfies at Holocaust memorials has drawn media attention and been the subject of academic analysis (Bareither 2021; Dubrofsky 2018; Feldman and Musih 2023; Zalewska 2017). Examples like these illustrate the gaps and contradictions in collective memory that exist across cultures and generations, demonstrating that even the remembrance of a historical event as sensitive as the Holocaust is not enacted in a uniform manner. The uniquely personal nature of the ‘sensitive locations’ block further complicates many longstanding debates surrounding the mutual shaping of individual and collective memory, particularly as advancing networked technologies facilitate new spaces for interaction and new ways of remembering.
The Memories feature assumes a somewhat authorial role in determining which personal photographs to resurface and which ones to exclude. Because the feature is targeted towards governing the on-device interactions users have with their personal photographs, it is unlike many other forms of content moderation, curation, or censorship, which traditionally aim to guide user interactions with media produced by others. Unfortunate or inappropriate encounters with material related to collective memories or histories more often fall into the domain of content moderation on the Internet through public and social media platforms. Questions and moral dilemmas surrounding how to adequately accommodate the variety of voices and perspectives present on social media while protecting users from the potentially objectionable or offensive content of others are therefore of major concern in the countless debates surrounding Internet governance (Gillespie 2018). What is peculiar about the ‘sensitive locations’ update is how it repositions these debates within the personal sphere, shielding users from accessing potentially sensitive content that they themselves have produced. This intervention is therefore an important development in the transition from the digital to the algorithmic memory of the Holocaust, as the ability to filter content in users’ personal photo libraries is premised on new levels of access and technical precision. These advanced capabilities are now embedded in the Photos app and have given rise to new possibilities for targeting certain types of content in the Memories feature. Even if it is now possible to prevent users from encountering specific histories in their personal photo libraries, the wisdom of such measures has been called into question:
Apple is free to ‘decide what is best’ when it comes to their offerings to their customers. However, when that spills over into deciding what is best with the content I myself have created, they've crossed a line. I hope they reconsider this move. -booger (2022)
As the commenter above observes, Apple is moving into relatively uncharted territory in the evolving debates surrounding content moderation and Holocaust representation in the algorithmic era. By intervening in this highly personal sphere of interaction, this decision calls attention to the unprecedented capabilities and potential consequences of embedding proprietary algorithms into the contemporary technologies of memory making. Governance decisions on the Internet and social media are typically aimed towards ensuring objectionable content does not surface in users’ interactions with media content produced by others, but guiding interactions with personally produced media is a significant departure from this domain. This new dimension of influence further complicates the increasingly blurry boundary between individual and collective memory in the ongoing transition from digital to algorithmic memory. The gravity of the Holocaust and the importance of its remembrance has been at the core of countless debates surrounding the representation of history in media and its role in shaping collective memory, an evolving relationship that is a reflection of the social, political, and technological changes occurring in and across cultures. By taking a new approach to the algorithmic handling of this sensitive history, Apple's decision illuminates the growing implications of using automated technologies in everyday memory making practices. Through this rare intervention between users and the content that they themselves have produced, Apple is uniquely positioned to influence how users’ individual experiences connect to important histories and collective memories. The authority on which these important decisions are made, however, remains open to debate. Though users’ recollection of the significant places they have visited is not framed entirely through their personal photographs, omitting these photographs from the Memories feature may lessen the impact of these experiences as they are less likely to resurface in users’ subsequent interactions with the Photos app.
The subjectivity of sensitivity
Interesting, I wonder if we will be able to add our own ‘Sensitive Locations’ I can think of a few places I don't want to be reminded of. -Caleb Jones (2022)
As the commenter above suggests, the designation of sensitivity is largely subjective and likely varies from user to user. It is unclear why this measure was referred to as a block on ‘sensitive locations’ given the highly specific nature of the history addressed with the initial rollout. This may be attributable to the fact that the designated list of 12 could be expanded or altered in future updates, an aspect of the story that is acknowledged in the original 9to5Mac post (Espósito 2022). The 9to5Mac post's headline, ‘iOS 15.5 beta blocks ‘Sensitive Locations’ for Memories in Photos app’, makes no reference to the Holocaust. Though this decision may deflect critical attention away from the specific history targeted by the block, it also creates space for further reflection upon what the designation of sensitive might mean and how this differs between the individual and the collective. There are countless possible interpretations of the term, and even in the particular context of automated memory many commenters questioned the basis of this description as well as the efficacy of this approach.
In 2016, the same year the Memories feature launched, Apple was granted a patent for a technology that could block iPhones from taking photos or videos in specific locations altogether (Lovejoy 2016). 9to5Mac's coverage of this patent focused on fairly limited potential applications of the technology, such as preventing concert-goers from holding their phones up and blocking others’ view or stopping video recording in movie theatres, while noting the potential upside of integrating it with augmented reality (AR) in other locations (Lovejoy 2016). Just as with the ‘sensitive locations’ update in 2022, news of this development spread to more critical commentary and coverage which highlighted how dangerous this technology could be. For example, Newman (2016) of Slate magazine noted how easy it would be to disable iPhone cameras at specific locations where users might otherwise capture evidence of wrongdoing, such as protests, thereby shielding private actors or even powerful corporate and state entities from accountability. Although this particular technology has not yet been implemented, it foreshadows the potential consequences of accepting the unprecedented access to personal photo libraries certain companies and technologies now possess.
This degree of access is somewhat unique to the Memories feature, which draws from users’ entire photo library as opposed to social media platforms which resurface content users have already chosen to post. This important attribute has been elaborated upon by Jacobsen and Beer (2021), who identify three distinct approaches to the automated repackaging of memories: apps dedicated to the revisitation of past social media content like Timehop, features embedded in larger platforms like the Facebook Memories feature, and finally those embedded in the smartphone software itself as is the case with Apple Memories. While these memory features all share the common issues surrounding how to appropriately curate and represent users’ most significant memories, Apple faces the added challenge of doing so while drawing from a much larger and presumably less calculated collection of photographs.
Contextual cues led some commenters to imagine what the ‘sensitive locations’ headline might be referring to before even reading the post: ‘At first I thought sensitive locations meant like your home and work so irrelevant pictures didn't get piled into memories, so I got a pretty good surprise when I heard what they actually mean by that’ (josh 2022). Others envisioned a more customisable approach to the block, suggesting locations like hospitals (Langley 2022) or bedrooms (jimthing 2022a) be omitted by the feature. In the context of the Photos app and the Memories feature, users are likely to consider what specific content they are aware of in their photo libraries that they would not want to see resurfaced in a slideshow. There are many reasons that photographs from locations specific to the Holocaust might not fall within Apple's preferred approach to representing the past, but this may be due to the narrow range of possible outputs currently offered up by the Memories feature.
One strategy for avoiding inappropriate, insensitive, or untimely encounters with user memories is to orient these features around happy content – but developing algorithms for accurately identifying a user's personal connection to a photograph continues to present major challenges for designers and engineers working in this space. This is because these connections are unique to each user, and their perceptions can shift dramatically over time as personal narratives and understandings change. Some commenters suggested that the feature might be improved by accommodating a greater variety of memories, but noted the difficulties designers would face in developing a system that could appropriately identify the emotional resonance of a photograph:
The question is, is the Memories feature just for fun, happy, exciting times, or should/could it be better if they made ones with more melancholic, unhappy, thought provoking times in one's life. And how would coding the latter be effected, as the happier ones are likely easier to do – and also much less fraught with difficulties about what is ‘appropriate’ per each event. - (jimthing 2022b)
The comment above makes the important observation that not all important memories are happy ones, and that an automated feature for resurfacing the past might benefit from giving more difficult (or sensitive) content its time and place. This is a reality that Apple and many others in the automated memory space are currently dealing with, but doing so may require a willingness to face more serious scrutiny for their role in shaping the memories of their users and the degree of access to personal content they require for their operation. Memories is premised on the ability to scan users’ entire photo libraries for specific content, yet potential concerns over privacy have been allayed by the feature's claims to operate securely and entirely on-device (Pereira 2022). Though Apple has long prided itself on its security (Wakabayashi 2014), these claims came under serious scrutiny in 2021 when the company planned the rollout of a Child Sexual Abuse Material (CSAM) detection system for iCloud Photos. Widespread criticism from the public, security experts, and even the company's own employees ultimately led to Apple's abandonment of its CSAM detection plans (Lovejoy 2023). The 9to5Mac article covering the impact of this controversy noted that even after the company admitted to flaws in the system, it did not address how this feature could be misused if its control fell into the wrong corporate or state hands (Lovejoy 2023). Apple's ability to scan users’ personal photo libraries carries serious implications beyond the shaping of memory; it is therefore crucial to interrogate not only the social values embedded in the Memories feature but the potential consequences of its widespread usage.
Sensitive? By what metric?
What's next? The iPhone camera not allowing us to take a picture because it thinks a minor is there without a swimsuit? The voice recorder app not recording because someone is moaning and saying ‘bad words’ to it? Is this where we are heading?. I get to decide if a specific place is a sensitive location or not, NOT an engineer at Apple. These kind of algorithms MUST be neutral, period. The algorithm must sort, group and save photos based on locations and nothing else. I later decide if I keep those locations and automated memories or not. -Kiki
Seems you don't understand what the article said, and naturally turn to doomsday theatrics. -Ginni's Weaselly (2022a, 2022b)
The exchange above demonstrates some of the variance among opinions and interpretations surrounding the ‘sensitive locations’ block. While many users imagined future scenarios wherein the technology could be used to automatically block all sorts of content, others seemed far less concerned. Kiki's speculative examples are indicative of the fears many users must reckon with as they become increasingly aware of the less desirable outcomes and possibilities algorithmic technologies may present. Although the article is specific about the history targeted by the block as well as the technological approach to achieving this aim, readers’ assessments of the potential implications of this move reveal distinct anxieties about the increased influence of algorithms for shaping interactions with personal media and memory. These concerns reflect long histories of tension between structure and agency as well as those between neutrality and personalisation, both of which have surrounded Apple for decades. By limiting access to the internal workings of their software and hardware, Apple's ‘closed world’ strategy has been increasingly scrutinised alongside the company's growth (Fried and Gold 2022). This popular criticism emerged in the ‘sensitive locations’ comments as well: ‘Apple has been doing this for years with all of their products. They want you to use it how THEY see fit’ (Kojack 2022). Such commentary reveals a scepticism towards the lack of freedom afforded to users of the Memories feature, who must rely on Apple's one-size-fits-all metrics for the automated curation of personal content.
Commenter Tom T. openly questions Apple's approach to the ‘sensitive locations’ block, describing it as ‘Very very weird’ before adding ‘I can understand users want to define sensitive locations. But Apple? By what metric?’ (Tom T. 2022). Apple has largely focused its promotion of Memories around the emotive aspects of the feature rather than the technical ones. Nevertheless, when a technology has been promoted for its cutting-edge computer vision and machine learning capabilities (Federighi 2016), the relatively straightforward use of location data to curate memories appears less sophisticated by comparison. In the digital era, a staggering amount of information can be linked to a photograph to assist with and automate organisation and retrieval. While geographic information is useful for the revisitation of personal photographs, it is also a feature that has been available in Exchangeable Image File Format (EXIF) data since 1995 and a growing number of locative social media features since the mid-2000s (Drakopoulou 2017). Commonly known as a ‘geotag’, the geographic information linked with a photograph at the time of its capture is a well-established practice in digital photography. This additional information is housed in the EXIF data, which usually contains the date and time a photograph was taken (timestamp) as well as ancillary information like camera settings and image metrics. The parameters that enable these 12 sensitive locations to be blocked from Memories do not require any advanced technical knowledge to comprehend: the geotag simply precludes the possibility of future algorithmic resurfacing.
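To illustrate just how simple such a parameter can be, the sketch below implements a geotag-based exclusion check in Swift. It is a speculative reconstruction rather than Apple's code: the SensitiveLocation type, the exclusion radius, and the coordinates (approximate values for one of the listed sites) are assumptions introduced purely for illustration.

```swift
import Foundation

// A hypothetical 'sensitive location' defined by a name, coordinates, and an
// exclusion radius in metres. The values below are approximate and illustrative.
struct SensitiveLocation {
    let name: String
    let latitude: Double
    let longitude: Double
    let radiusMetres: Double
}

// Great-circle (haversine) distance between two coordinates, in metres.
func distanceMetres(lat1: Double, lon1: Double, lat2: Double, lon2: Double) -> Double {
    let earthRadius = 6_371_000.0
    let dLat = (lat2 - lat1) * .pi / 180
    let dLon = (lon2 - lon1) * .pi / 180
    let a = sin(dLat / 2) * sin(dLat / 2)
        + cos(lat1 * .pi / 180) * cos(lat2 * .pi / 180) * sin(dLon / 2) * sin(dLon / 2)
    return 2 * earthRadius * atan2(sqrt(a), sqrt(1 - a))
}

// Returns true if a photo's geotag falls within any blocked radius and should
// therefore be withheld from automated resurfacing.
func isBlocked(photoLat: Double, photoLon: Double, blocklist: [SensitiveLocation]) -> Bool {
    blocklist.contains { site in
        distanceMetres(lat1: photoLat, lon1: photoLon,
                       lat2: site.latitude, lon2: site.longitude) <= site.radiusMetres
    }
}

// Example with one approximate entry from the reported list of 12.
let blocklist = [
    SensitiveLocation(name: "Auschwitz-Birkenau Concentration Camp",
                      latitude: 50.035, longitude: 19.178, radiusMetres: 2_000)
]
print(isBlocked(photoLat: 50.036, photoLon: 19.179, blocklist: blocklist)) // prints "true"
```

Under these assumptions, once a geotag exists in a photograph's metadata, withholding that photograph from automated resurfacing requires little more than a distance comparison rather than any advanced computer vision.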
The commentary on the ‘sensitive locations’ article reveals that there is little consensus on Apple Memories’ intended uses, and its functionality remains largely mysterious to many of its users. As one commenter suggests, ‘It seems some people might misunderstand the Memories feature’ (MPD01605 2022); however, perhaps the larger concern is whether anybody truly understands the Memories feature. Since Apple has largely refrained from offering technical explanations of how the Memories feature operates, these important details can easily elude even the most engaged users. An important precursor to the iOS 15.5 ‘sensitive locations’ update was the introduction of quick access to EXIF data in the Photos app. Though the iPhone camera has made use of EXIF data since the first users began taking photos in 2007, this information was not directly accessible in the Photos app until the launch of iOS 15 in September 2021. Taken together, these updates indicate an oscillation between increasing user agency through direct access to metadata and curtailing it through the targeted omission of specific photographs. This persistent tension illustrates some of the anxieties in the transition from digital to algorithmic memory, as many users may feel empowered by certain advancements in technologies while remaining largely unaware of their limitations and shortcomings.
The relative simplicity of the ‘sensitive locations’ block can be seen as an important contribution to demystifying aspects of the operations that guide the Apple Memories algorithm. This atypical journalism therefore counteracts the growing tendency to willingly accept the unknowability of algorithmic processing, a phenomenon commonly known as ‘black boxing’. Algorithmic imaginaries are often framed through this limiting functional analogy, even as it shrouds algorithmic systems in secrecy and renders them unnecessarily opaque (Bucher 2018). As Bucher (2018) suggests, ‘the wide-spread notion of algorithms as black boxes constitutes something of a red herring – that is, a piece of information that distracts from other (perhaps, more pressing) questions and issues to be addressed’ (p. 44). The metrics by which algorithmic judgments of personal content are made are far from neutral. The various algorithms used to identify and sort images in the Memories feature remain bound to the original specifications developed by Apple, and therefore embody certain social and cultural values of the company. As algorithmic systems are designed around data structures based on form and regularity rather than actual content (Dourish 2016), Memories is only ‘personalised’ in the sense that its outputs are drawn from the unique photo collections of its individual users. This obfuscates the fact that the one-size-fits-all system is standardised across the entire userbase and can inadvertently reproduce consequential biases or even intentionally target important histories.
Conclusion
As users incrementally relinquish responsibility for organising and retrieving their own photographs, acts of remembrance will increasingly occur through layers of mediation that privilege certain perspectives at the possible expense of others. Apple Memories and the decision to block ‘sensitive locations’ together form an important case study for considering the potential consequences of automated memory technologies in shaping both individual and collective memory. This case offers an entry point to interrogate the influence of black boxed algorithms, creating space to question the social values embedded in these systems and how they mediate between individual experiences and more significant shared histories or collective memories.
Through its heightened degree of access to personal photo libraries, Apple Memories is uniquely positioned to intervene between users’ individual experiences and collective memories. Blocking ‘sensitive locations’ from the Memories feature demonstrates the potential of the technology for identifying content and targeting outcomes which reflect the historical, social, and cultural sensibilities of the company. The significance of the history omitted and the simplicity of the intervention to accomplish this aim reveal the importance of critically interrogating the algorithmic operation and prospective consequences of automated memory technologies. The range of public opinion on display in the comment section of the 9to5Mac post offers insight into the imaginaries that surround automated memory technologies and the range of perspectives that both legitimate and contest their presence in shaping memory. These comments call attention to the three areas of concern addressed in this article: the limitations of platform intervention into personalised media environments, the subjective nature of interactions with personal media, and the problems of using one-size-fits-all algorithms for previously personal memory acts. Every iOS update carries the potential to dramatically reconfigure how users access their personal photos and come to remember their pasts; it is therefore crucial to interrogate how the values embedded in these systems impact both the individual and collective memories of their users.
Data availability statement
The data that support the findings of this study are openly available on 9to5Mac at https://9to5mac.com/2022/04/26/ios-15-5-beta-blocks-sensitive-locations-for-memories-in-photos-app/
Funding statement
This work received no specific grant from any funding agency, commercial or not-for-profit sectors.
Competing interest
The author declares none.
Chrys Vilvang is a PhD candidate at Concordia University. His research interests include memory, photography, digital archives, algorithms, and artificial intelligence.