<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Lawless Tech]]></title><description><![CDATA[How do we protect freedom in a tech-mediated world?]]></description><link>https://blog.boulos.ca</link><image><url>https://substackcdn.com/image/fetch/$s_!RuOp!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F989a6b69-7601-4ceb-ae54-7e0ac269a038_1280x1280.png</url><title>Lawless Tech</title><link>https://blog.boulos.ca</link></image><generator>Substack</generator><lastBuildDate>Sun, 12 Apr 2026 07:40:05 GMT</lastBuildDate><atom:link href="https://blog.boulos.ca/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Matthew Boulos]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[matthewboulos@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[matthewboulos@substack.com]]></itunes:email><itunes:name><![CDATA[Matt Boulos]]></itunes:name></itunes:owner><itunes:author><![CDATA[Matt Boulos]]></itunes:author><googleplay:owner><![CDATA[matthewboulos@substack.com]]></googleplay:owner><googleplay:email><![CDATA[matthewboulos@substack.com]]></googleplay:email><googleplay:author><![CDATA[Matt Boulos]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Love, Labour, and Language]]></title><description><![CDATA[AI won&#8217;t change what it means to be human. 
We need to prepare for what's actually coming.]]></description><link>https://blog.boulos.ca/p/love-labour-and-language</link><guid isPermaLink="false">https://blog.boulos.ca/p/love-labour-and-language</guid><dc:creator><![CDATA[Matt Boulos]]></dc:creator><pubDate>Wed, 01 Apr 2026 19:37:59 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!7Hju!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3166a31b-2d9d-4cf4-9d60-ed25cc34b8e8_721x378.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!7Hju!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3166a31b-2d9d-4cf4-9d60-ed25cc34b8e8_721x378.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!7Hju!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3166a31b-2d9d-4cf4-9d60-ed25cc34b8e8_721x378.png 424w, https://substackcdn.com/image/fetch/$s_!7Hju!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3166a31b-2d9d-4cf4-9d60-ed25cc34b8e8_721x378.png 848w, https://substackcdn.com/image/fetch/$s_!7Hju!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3166a31b-2d9d-4cf4-9d60-ed25cc34b8e8_721x378.png 1272w, https://substackcdn.com/image/fetch/$s_!7Hju!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3166a31b-2d9d-4cf4-9d60-ed25cc34b8e8_721x378.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!7Hju!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3166a31b-2d9d-4cf4-9d60-ed25cc34b8e8_721x378.png" width="721" height="378" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3166a31b-2d9d-4cf4-9d60-ed25cc34b8e8_721x378.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:378,&quot;width&quot;:721,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:492721,&quot;alt&quot;:&quot;Three Friends, Xugu (Zhu Huairen) (Chinese, 1823&#8211;1896), 1894. The Metropolitan Museum of Art, New York.&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://blog.boulos.ca/i/192880144?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3166a31b-2d9d-4cf4-9d60-ed25cc34b8e8_721x378.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Three Friends, Xugu (Zhu Huairen) (Chinese, 1823&#8211;1896), 1894. The Metropolitan Museum of Art, New York." title="Three Friends, Xugu (Zhu Huairen) (Chinese, 1823&#8211;1896), 1894. The Metropolitan Museum of Art, New York." 
srcset="https://substackcdn.com/image/fetch/$s_!7Hju!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3166a31b-2d9d-4cf4-9d60-ed25cc34b8e8_721x378.png 424w, https://substackcdn.com/image/fetch/$s_!7Hju!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3166a31b-2d9d-4cf4-9d60-ed25cc34b8e8_721x378.png 848w, https://substackcdn.com/image/fetch/$s_!7Hju!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3166a31b-2d9d-4cf4-9d60-ed25cc34b8e8_721x378.png 1272w, https://substackcdn.com/image/fetch/$s_!7Hju!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3166a31b-2d9d-4cf4-9d60-ed25cc34b8e8_721x378.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption"></figcaption></figure></div><p>I&#8217;ve noticed that a certain type of writing irritates me more than others. If something&#8217;s obviously poor, it slides by me, and, if it&#8217;s good or validates my beliefs (is there a difference?), I get a warm feeling that quickly fades. What stays are the insights powerful enough to change my thinking, and what gets jammed between my teeth are the ones that should have but miss the landing.</p><p>Writing on Wittgenstein and AI (as one does) in <em>Commonweal</em> <a href="https://www.commonwealmagazine.org/wittgenstein-apocalypse-ludwig-stern-ai-artificial-intelligence-technology">last week</a>, Alexander Stern made the useful point that, for Wittgenstein, language, particularly for the young, doesn&#8217;t act so much as a signifier for objective things as a vehicle to link us with each other -- that communication is an act of connection more than it is an act of description.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://blog.boulos.ca/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading Lawless Tech! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p>The point is profound, and when we allow ourselves to think about what we really do with words, it also, at least to me, feels intuitively powerful. When I talk to my wife or child, yes, there are often things I&#8217;m trying to accomplish (put on your shoes!), but the vastly more significant purpose, and why we choose how we speak to those we love, is simply to be together.</p><p>So, language is certainly more than mere description or instruction. That&#8217;s nice.</p><p>But that doesn&#8217;t then exclude the functional uses of language. Spend time working in the business world or writing code and it becomes clear how useful it is to reduce language to the crassness of a delineated task.</p><p>Talk to an experienced software engineer and you&#8217;ll learn how radically they&#8217;re changing what they do. 
They typically haven&#8217;t surrendered their pride in the craft of writing elegant (or ugly but useful) code. But now, more like a partner at a professional services firm, they&#8217;re using language to induce machines to produce more language to get things done, and, because so much of modern work is done digitally, the power is rather intoxicating.</p><p>And so we have language, generated by humans and machines, received by humans and machines, both furnishing the profoundest bonds and executing pathetic tasks, and this reality seems to elude an awful lot of people when it comes to talking about what AI will do to our world.</p><p>Stern is right that we&#8217;ll continue to want connection with each other -- what a funny thing to write, why wouldn&#8217;t we? -- and at the same time we&#8217;re going to automate an astonishing amount of work -- if it can be captured with language, it&#8217;s susceptible.</p><p>And those two facts will be as deeply intertwined in this era as operating a steam engine and grousing with one&#8217;s neighbour were in a previous era. People, to the extent their agency allows, will still choose who they want to associate with in their work, and I will assuredly unleash any machine that lets me ignore the administrative details of my life without steering my family into a ditch.</p><p>The fantasy that work might be entirely taken over by machines is all a bit silly. Someone needs to take the blame (or the credit) for whatever the machines do -- otherwise, what&#8217;s the point? The deeper question is about who&#8217;s in charge, who benefits, and what happens to everyone else as we build systems this catastrophically destabilizing.</p><p>We&#8217;re in for a ride. If what I&#8217;m seeing in the world of software extends similarly to other areas of working life (and I think it will), we&#8217;re going to feel the forces against our skin. 
These may be matters of degree, but degrees can be severe.</p><p>We&#8217;ll still work. We&#8217;ll still have our humanity. And, as Jay Caspian Kang <a href="https://www.newyorker.com/news/our-columnists/whats-the-point-of-reading-writing-by-humans">wrote</a> not long after ChatGPT showed up, we&#8217;ll still fuss about what someone has to say, because it&#8217;s fun to be mad at them for saying it. It&#8217;s time we stopped acting like human nature is in the balance and started properly asking how we&#8217;ll care for each other when everything is upside down.</p>]]></content:encoded></item><item><title><![CDATA[Digital freedom depends on access rights]]></title><description><![CDATA[Platforms arbitrarily block AI agents. 
The implications go beyond consumer choice to liberty itself.]]></description><link>https://blog.boulos.ca/p/digital-freedom-depends-on-access</link><guid isPermaLink="false">https://blog.boulos.ca/p/digital-freedom-depends-on-access</guid><dc:creator><![CDATA[Matt Boulos]]></dc:creator><pubDate>Mon, 15 Dec 2025 19:44:49 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!RuOp!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F989a6b69-7601-4ceb-ae54-7e0ac269a038_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p></p><p><em>This piece was originally published in </em><em>Lawfare</em> <em>on December 11, 2025 as </em><a href="https://www.lawfaremedia.org/article/digital-freedom-depends-on-access-rights">Digital Freedom Depends on Access Rights</a><em>. Special thanks to Olivia Manes and the Lawfare team for all their help.</em></p><p><em>I&#8217;d love to hear your thoughts and reactions.</em></p><p></p><div><hr></div><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://blog.boulos.ca/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading Lawless Tech! 
Subscribe to receive new posts.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><div><hr></div><p></p><p>Twenty-five years ago, legal scholar <a href="https://www.harvardmagazine.com/2000/01/code-is-law-html">Lawrence Lessig warned</a> that the architecture of technology&#8212;what he called &#8220;code&#8221;&#8212;could become one of the greatest threats to modern liberty, if not for the potential of markets and norms to moderate that risk. He turned out to be half right: Code has since done what he feared it might, but markets have only amplified these effects, and norms have done little to constrain them.</p><p>We&#8217;re now left with law as a last resort, but it may be enough to secure a future in which AI agents can act as a counterbalance to tech&#8217;s current trajectory. AI agents&#8212;systems that work intelligently and autonomously for a user&#8212;are the next step after large language models (LLMs). They work by turning the output of those models into an intelligible sequence of real actions. We now have a critical choice to make about what they will become: The legal framework we give them will either mean that this individualized power tempers the current lawlessness of our digital lives, or render us all the more subject to it.</p><p>Take privacy law, for example. Over the past century, it strengthened along parallel tracks: From the protection of sealed mail in 1917 to the unanimous warrant requirement of <a href="https://www.oyez.org/cases/1971/70-153">United States v. U.S. District Court</a> in 1972, an understanding eventually emerged that governments couldn&#8217;t intrude into private lives without justification. 
Similarly, <a href="https://scholarship.law.gwu.edu/faculty_publications/955/">William Prosser&#8217;s torts</a> in the 1960s became the foundation of private claims to be let alone. But now, in the face of digital technology, those gains are eroding.</p><p>As our lives move from spaces governed by government to spaces governed by private actors, the distinction between public and private is collapsing&#8212;imperiling our rights within each category. On the public front, the <a href="https://www.washingtonpost.com/business/2025/05/07/doge-government-data-immigration-social-security/">federal government has worked to merge datasets</a> holding highly sensitive information, making them readily available to its workers. On the private front, <a href="https://www.ftc.gov/news-events/news/press-releases/2024/09/ftc-staff-report-finds-large-social-media-video-streaming-companies-have-engaged-vast-surveillance">for-profit data gathering has become endemic</a>&#8212;yet Prosser&#8217;s torts have not been updated, nor federal law instantiated, to account for a completely distinct mode of private intrusion.</p><p>Because private actors control the digital spaces of our lives and set their rules, governments turn to them to do things they can&#8217;t on their own: Until public scrutiny <a href="https://www.404media.co/airlines-will-shut-down-program-that-sold-your-flights-records-to-government/">shut the program down</a>, law enforcement had been able to <a href="https://www.404media.co/airlines-sell-5-billion-plane-ticket-records-to-the-government-for-warrantless-searching/">ignore warrant requirements</a> for personal flight data by buying it in advance from airlines, and the Department of Justice can <a href="https://www.businessinsider.com/apple-iceblock-app-store-removed-2025-10">censor the speech of private citizens</a> by asking Apple to remove apps it doesn&#8217;t like from the App Store. 
Private actors have power akin to government, and use it as they see fit.</p><p>These same capabilities that abet massive information collection also enable centralization and control, which is how we&#8217;ve ended up in a world where credit card processors <a href="https://www.pcgamer.com/software/platforms/valve-confirms-credit-card-companies-pressured-it-to-delist-certain-adult-games-from-steam/">decide in practice</a> the categories that can or can&#8217;t be sold online, and social media companies almost uniformly employ practices meant to hijack our attention and addict users.</p><p>The point isn&#8217;t that technology companies are misbehaving (though they often are)&#8212;it&#8217;s that their control of digital technologies gives them an unprecedented amount of power over our lives. We must find a way to wrest it back if we want to be free. AI agents provide a possible solution.</p><p></p><p><strong>The Role of AI Agents</strong></p><p>Consider these competing visions. A social media feed is an AI system <a href="https://hbr.org/2022/11/our-social-media-addiction">trained to addict you</a>; an airline&#8217;s pricing algorithm is often an AI system that <a href="https://hls.harvard.edu/today/how-delta-airlines-and-other-companies-use-dynamic-pricing-to-determine-how-much-you-pay/">dynamically changes fares to maximize your spending</a>. They give power to platforms. A loyal AI agent does the opposite: It filters your social media feed as you prefer or shops flights across times and airlines to tilt the negotiating balance back in your favor. Without AI agents pushing back in this way, predatory AI systems will deepen the entrenchment of the former picture.</p><p>A specific category of AI agents, called coding agents, creates software (instead of acting directly on a user&#8217;s behalf). In a few years&#8217; time, most people will be able to quickly and cheaply generate custom software for themselves and others. 
But useful software depends on interactions with other systems and your data. The current rules, as set by the platforms, mean that this access will be capricious at best.</p><p>This, in turn, means that software that conflicts with the intentions of the major tech platforms will be blocked. As it stands, <a href="https://epic.org/issues/consumer-privacy/data-brokers/">data brokers can package and sell your data</a> and you have no federal right to intervene. Health providers <a href="https://www.congress.gov/crs-product/R48570">obfuscate costs</a> so that you cannot shop for the care you can afford. Content and social media platforms can permanently hold your content, without a right of rescission.</p><p>Platforms are using their power to <a href="https://www.theguardian.com/technology/2025/nov/18/amazon-vs-perplexity-the-ai-agent-war-has-arrived">deny personal AI agents access</a> to the systems and data they need to work. Amazon <a href="https://www.mediapost.com/publications/article/407835/amazon-shopify-seen-using-code-to-block-ai-agents.html">blocks ChatGPT</a> and other AI tools. Salesforce changed its terms of service to <a href="https://www.reuters.com/business/salesforce-blocks-ai-rivals-using-slack-data-information-reports-2025-06-11/">prevent companies like Glean</a>, an AI search tool, from using data from Slack (which Salesforce owns), while negotiating deals for preferential rights to data on other platforms. This appears to be an ideal arrangement for tech&#8217;s giants: Shutting down challengers while using their market dominance to re-create the products they just crushed.</p><p>This was the core insight of the <a href="https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-markets-act-ensuring-fair-and-open-digital-markets_en">EU&#8217;s Digital Markets Act</a>. 
Whatever its flaws, it squarely recognized that this kind of power translates into a form of private government, and that a citizenry without the capacity to resist cannot be said to be free. When our interests are aligned with a platform, this is less of an issue; but when they diverge, the human stakes can be exceedingly high. When children are driven to <a href="https://www.fastcompany.com/91407443/big-tech-liable-chatbot-suicide-cases">suicide by chatbots</a> whose operators disclaim responsibility or private data collection creates the conditions for warrantless searches and detentions, liberty is on the line.</p><p>Narrowing our choices also creates bridges to other harms. Our data is used against us in an increasingly lopsided manner, especially when we can be individually targeted and algorithms render us legible to a degree that companies would be embarrassed to publicly admit.</p><p>Moreover, the legal tools that might have worked elsewhere are no longer effective in this sphere. Antitrust, to the extent that enforcement and remedies have any bite, corrects only for the abuses of monopoly. But the denial of access that interoperability rights are meant to reverse is an industrywide practice that has been sustained by convention and perpetuated by convenience and power. It&#8217;s not a consumer choice problem&#8212;at least not among platforms&#8212;when what&#8217;s lost isn&#8217;t the existence of a single interoperable platform, but the broad-based ability to exercise agency anywhere in our digital lives.</p><p>Similarly, non-discrimination rights&#8212;the primary tools to remedy denial of access in the offline world&#8212;are inert here, because this erosion of rights does not discriminate. Some of the impacts may be disparate in their effect, but this is a broad-based disenfranchisement. 
Focusing only on protected categories would be both oddly perverse and exceedingly hard to demonstrate absent the access rights that we&#8217;re fighting for in the first place.</p><p>What we need instead is a specific right of access to both our data and the systems that we depend on for digital life, so that platforms cannot discriminate against anyone on the basis of the tools they use for access, or against our desire to ensure that the data they have about us is what we want them to hold. It&#8217;s a frank departure from the present moment&#8217;s free-for-all, where platforms have very few obligations with regard to data access in the U.S. and where they can use their terms of service to legally prohibit the use of third-party software&#8212;which naturally encompasses AI agents&#8212;to access their services.</p><p></p><p><strong>The Regulatory Landscape</strong></p><p>The most prominent interoperability and data access regulations under consideration by Congress or at the state level make no surrender on this front. The bipartisan <a href="https://www.congress.gov/bill/119th-congress/senate-bill/1634/text">ACCESS Act</a>, which was reintroduced by Sen. Mark Warner (D-Va.) and co-sponsored by Sens. Josh Hawley (R-Mo.) and Richard Blumenthal (D-Conn.), requires that platforms above a certain threshold create interfaces that third-party agents can use to evaluate and potentially remove personal data that has been collected by those platforms. It has been designed specifically to ensure that platforms are unencumbered in protecting users. <a href="https://www.nysenate.gov/legislation/bills/2025/S7476">Other bills</a> being considered at the state level more directly encode a right of access.</p><p>Absent these rights, users are vulnerable to the whims of the platforms. 
It&#8217;s worth recalling that the ACCESS Act was first drafted in response to the <a href="https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election">Cambridge Analytica scandal</a>, in which a consulting firm gained unauthorized access to the data of millions of Facebook users to target them for political advertising.</p><p>But what about the rights and claims of the platforms themselves? Worries about patchwork state regulation are valid, but the claim that over a thousand state AI bills have been passed is <a href="https://stevenadler.substack.com/p/mythbusting-the-supposed-1000-ai">questionable</a>, and the active funding of efforts to kill federal regulation, like the <a href="https://fortune.com/2025/08/26/openai-president-greg-brockman-andreessen-horowitz-super-pac-ai-pro-innovation/">Meta and OpenAI anti-regulatory PACs</a>, shows that the claim is being made in bad faith.</p><p>There is no such thing as a neutral regulatory environment. <a href="https://www.congress.gov/crs-product/R46751">Section 230</a> of the Communications Decency Act prefigured the modern internet. With the swipe of a holding, the Supreme Court <a href="https://www.scotusblog.com/2024/06/supreme-court-strikes-down-chevron-curtailing-power-of-federal-agencies/">ended the Chevron doctrine</a> and narrowed the scope of federal agency decision-making. As Lessig noted, &#8220;When the interests of government are gone, other interests take their place.&#8221; We won&#8217;t have the capacity to represent our own interests against tech platforms if we buy the idea that government intervention is always bad (and private interests are always good).</p><p>This belief puts the interests of tech founders, investors, and some of their workers above those of their fellow citizens. It pretends away the incursions on liberty that come when we lose the cover of responsible government as we move through digital life. 
And it fails to recognize that a much richer vision of technology and society becomes possible when we protect fundamental rights while leaving markets to work.</p><p></p><p><strong>***</strong></p><p></p><p>We are at a crossroads. AI agents are giving individuals powers once exclusive to large tech platforms. What legal framework will govern them? Inaction&#8212;the current default&#8212;lets platforms decide, and the decades of experience since Lessig&#8217;s alarm show they will not prioritize personal freedom. The alternative is to learn from the internet&#8217;s early successes: open protocols and shared data that provide autonomy. We can establish structures that let personal AI agents help us reclaim control over the digital spaces where we increasingly live, work, and commune. Choosing not to will mean surrendering our freedom as we make one of our largest steps yet into the digital future.</p>]]></content:encoded></item></channel></rss>