I don't like these "not a" framings because they always sound so suspicious, like "we are not a cult".
It puts the idea into the world that it could be a crime, and maybe even that that's the status quo.
Much better IMHO is something like "Encryption is a fundamental right.", "Encryption protects everyone.", "Without encryption there is no democracy." and so on.
Maybe "Don’t let them take your right to privacy."
I don’t understand this. If you live in the US and use a service like ProtonMail, has a crime been committed? Are there any examples, in the US or anywhere else, of arrests or prosecutions over encryption? I’ve never heard of any.
Also, I've heard it said that people have a tendency to subconsciously drop the "not" and remember that sort of statement as "encryption is a crime". It is slightly better to put things positively (e.g., "Encryption is the reasonable default").
It makes sense in this context, as it operates on the idea that it could be a crime: "Contrary to what some policymakers seem to believe, whether naively or maliciously, encryption is not a crime."
Claiming a right is not the same as exercising the right, though typically the claim precedes the exercise. Also, are you suggesting you don't have the right to use encrypted communications? (assuming you're not in France, where the "right" to encrypted communications has never been explicitly recognized.)
I think you also have to delete the original information before it becomes a crime. Though monkeying around on someone else's machine(s) without permission is criminal behaviour in most jurisdictions I've looked at.
Not all speech is non-criminal, at least in the US. Inciting speech has always been subject to prior restraint, and fraudulent commercial speech has always subjected the speaker to legal peril.
I think this is a perception formed by media biases. Pretty much any right or freedom evaluated on an individual basis will show that rights and freedoms have expanded (at least up to a few months ago). Many of the negative things being done today have been done in one form or another for a generation or two. I'm not saying that they're right or shouldn't change, just that the perspective of eroding freedoms or rights is generally not true outside of business regulations.
No doubt 'perception formed by media biases' has played a significant part, and it's been exacerbated by modern communications, social media, etc., but I'd contend there's more to it than just that.
What I didn't mention is that I've been to the US many times, I have relatives there, and I've even worked there; these factors have also influenced my perception.
Let me put it this way: if the Greatest Generation, aka the G.I. Generation, were to come back today and see what has happened, they'd not only be dismayed but horrified. Right, much of that reaction is to be expected with intergenerational change, etc., but again I'd suggest it's more than that.
It's not possible for me to even begin to justify what I've said, as even a précis would take me many pages. Instead, I'd refer you to journalist Tom Brokaw's 1998 book The Greatest Generation, wherein he describes the values and beliefs of the people of this generation as well as the ethos of the era in which they lived. Far be it from me to tell American society what it ought to be doing, but I'm of the opinion it wouldn't be a bad idea if all Americans read this book; after all, it's actually Brokaw who's suggesting that his countrymen read it, or he wouldn't have written it.
In short, Brokaw wrote this book because he sensed the same change in US society as I had, and no doubt much more acutely. I'll now extrapolate: it's now over a quarter century since he wrote it, and I'd contend the contrast to which he referred is even more extreme.
BTW, don't just take my word for it; I'd suggest you search out some of the book's reviews.
Incidentally, when I was working in NY some decades ago I shared my office with a GI of that generation, and he became a great friend. I had many discussions with him about his past experiences. I consider it a great privilege to have known him (his name would be familiar to some of you).
It probably was never the country you once knew. The view of "American freedoms" is radically different depending on the viewer's position in the socioeconomic hierarchy.
> I wish Americans still believed in American freedoms
I wish people understood the American system at a philosophical level. What you call "American freedoms" are largely based on negative rights, i.e. John Locke. Our Bill of Rights uses specific language like "Congress shall make no law", "shall not be infringed", "shall not be violated". It's inherently freedom from state action.
Over the past 100 years a different interpretation of rights has emerged, so-called positive rights, as exemplified in FDR's Second Bill of Rights; e.g. "the right to a good education" or "the right to earn enough to provide adequate food and clothing and recreation". This requires state action to facilitate freedoms for its citizens.
Unfortunately these systems are incompatible. I think a lot of the friction we are seeing in modern times can partially be traced to this contradiction.
"Unfortunately these systems are incompatible. I think a lot of the friction we are seeing in modern times can partially be traced to this contradiction."
I'm pretty certain you're correct, but I won't attempt to justify it in detail here, as we'd have to bring out the political philosophy texts en masse.
In the light of the English Civil War many thought about politics and freedoms, Locke being one; his near-contemporary Thomas Hobbes took a different position in Leviathan. Rights, freedoms, and social contract theory were still raging nearly a century later with Rousseau whingeing that man is born free but is everywhere in chains (the opening line of The Social Contract). And there's still no universally agreed consensus.
Over the centuries political philosophy has covered almost every conceivable interpretation/position on the rights and powers of the State versus individual freedoms, so it's not for want of options. Dichotomies still remain because the citizenry is composed of people with a wide range of political beliefs, many of which are incompatible (this has always been the situation).
I'm not sure the US population ever really believed in fundamental freedoms.
They had apartheid up to 60 years ago. There are living people from that time, and you can't believe in any human right and maintain apartheid at the same time.
People believe in logical inconsistencies all of the time, it’s practically the default. Also there is no such thing as perfect freedom, it’s best thought of as an optimization problem with many dimensions.
As an example, the civil rights act necessarily curtails the freedom of association.
For many of us outside the US there's a dichotomy here. The North won the bitterly contested Civil War and freed the slaves but never really afforded them true freedoms. Why?
The perception from the outside is that conscience over slavery per se drove the North to war, not concern for the fact that slaves were actually people who were suffering enslavement and/or unfair treatment.
Edit: Given the Civil War why was the Civil Rights Movement 100 years later necessary?
> You are suggesting that family courts in Colorado should be barred from hearing info from psychologists about the impact of dead-naming?
Classic strawman argument. Where was anything like this suggested? Are they barred today under existing legislation?
The eventuality of this line of reasoning is that: "special interests who control the DSM (or whatever standards body governs these soft sciences) can influence and determine the outcome of custody battles."
DSM-4 defined "gender-identity disorder" as a thing, that's now been de-pathologized to "gender dysphoria."
Under your framework, a body of unelected, politically and financially-motivated "experts" can now determine the imposition of legal consequences on a whim.
There was a question mark at the end of the sentence, indicating a question.
"Are you suggesting that family courts in Colorado should be barred from hearing info from psychologists about the impact of dead-naming?"
It is a simple question. I think it tells more about your viewpoint than you may think that you consider discussions of trans issues "a strawman."
I also appreciate how you have decided that you know my thoughts on a complex subject simply by me asking you to provide more detail as to what you were saying.
It's entirely possible I'm not attacking you. It's possible I don't understand what you're saying.
This is too many words to convince someone who already doesn’t believe this.
Put more simply: the modern internet doesn’t work without encryption, it is a fundamental part of the technology. Without it, anyone could log into any of your accounts, take your money, messages, photos, anything.
>Put more simply: the modern internet doesn’t work without encryption, it is a fundamental part of the technology. Without it, anyone could log into any of your accounts, take your money, messages, photos, anything.
I'm pretty pro encryption, but even this is pretty dishonest. Phones (i.e. PSTN, not iPhones) aren't "encrypted" by any means, but there's plenty of sensitive information sent over them. Lawyers fax each other important documents, and doctors fax each other medical records. There was (is?) even telephone banking where you could do basic transactions over the phone. Even today, some banks/brokerages require you to phone in to do certain high-risk operations (e.g. high-value transfers or account resets). All of this happens without encryption. While that's less security than I'd like, it's safe to say that "anyone could log into any of your accounts, take your money, messages, photos, anything" isn't true either.
I’m not saying every layer of the onion is individually encrypted. But there are plenty of layers that are.
There is plenty of encryption used when you send any sort of message from an iPhone, even SMS. You can’t even turn the dang thing on and unlock it without encryption. Then when you send it, it’ll be encrypted by the radio before transmission. Then in transit it may or may not be encrypted at various points.
And POTS is not the internet.
My overall point is that encryption is used all of the time when people use the internet for routine tasks that they expect to work, and would not work in a modern reasonable way without it.
People use these technical implementations details to muddy the water of this conversation and demonize encryption, when the reality is that everyone uses it literally all the time for almost everything.
>There is plenty of encryption used when you send any sort of message from an iPhone, even SMS. You can’t even turn the dang thing on and unlock it without encryption. Then when you send it, it’ll be encrypted by the radio before transmission. Then in transit it may or may not be encrypted at various points.
If your argument for encryption is "we need encryption because if it's banned overnight all our phones will turn into bricks!", then yeah sure I guess it's true. But even the diehard encryption opponents aren't arguing for this. My point is that you can very much have no encryption, but not "anyone could log into any of your accounts, take your money ...".
Colloquially, there is a perception among some that encryption is a thing that only the military, criminals, etc use.
Many people are unaware that they use it in everyday life.
If you listen to discussions on this topic outside of technical forums, this perception is not uncommon. It’s important to be clear to laypeople about the ubiquity of encryption, because they are the majority of voters.
Another aspect: traditionally, the administrative burden for state actors to receive permission to eavesdrop on POTS technology was relatively high. Or at least it was before the Patriot Act. I would argue it is still higher than for eavesdropping on modern digital communications (IPCMS, email, web browsing, etc.).
Allowance for using faxes to send protected health information (PHI) as defined under HIPAA was essentially grandfathered in for practical reasons, not because it is at all a secure enough communications system for sensitive data. If faxing medical records had been banned, the healthcare system would have come to a halt, which would have been worse than the privacy risk. But if fax were invented as something new today, it would never be allowed for PHI.
It's only recently that more secure alternatives to faxing have become practical, like DirectTrust Secure Direct Messaging.
1. How often are people saying their bank login on their phone calls?
2. Is there a way for phone-call man-in-the-middlers to get that info without wasting a ton of time listening to calls? With internet MITM it is very easy to set up a program that scrapes unencrypted login info.
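To make the contrast concrete: on an unencrypted connection, credentials cross the wire as machine-readable text, so scraping them really is a one-liner. A minimal sketch (the request text, hostname, and field names below are made up for illustration; no real service is involved):

```python
import re

# A captured plaintext HTTP login request, exactly as an eavesdropper on
# the network path would see it. (Hypothetical example traffic.)
captured = (
    "POST /login HTTP/1.1\r\n"
    "Host: bank.example\r\n"
    "Content-Type: application/x-www-form-urlencoded\r\n"
    "\r\n"
    "username=alice&password=hunter2"
)

# With no encryption, pulling the credential out is a single regex --
# the kind of automated scraping described above.
match = re.search(r"password=([^&\s]+)", captured)
print(match.group(1))  # -> hunter2
```

Against TLS traffic the same regex sees only ciphertext and matches nothing, which is why automated credential harvesting targets plaintext protocols.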
>1. How often are people saying their bank login on their phone calls?
Have you ever called into a bank or brokerage? Most ask "security questions", often ones that you can't even choose, like your address or how many accounts you have with them. It's arguably far worse than speaking your password into the phone.
>2. Is there a way for phone call man in the middlers to get that info without wasting a ton of time listening to calls?
Automated speech recognition has been around for decades. Even before that signals intelligence agencies have shown that widespread wiretapping/eavesdropping is possible and effective.
It's about threat levels, I guess. I was man-in-the-middling passwords in high school without knowing much of anything about technology. Setting up speech recognition alone is a task that most people are incapable of doing. If you're worried about the type of people who can set that up, you probably shouldn't be giving that info out on a phone call.
“Setting up speech recognition alone is a task that most people are incapable of doing”
If you were MITMing in HS, your modern-day equivalent is way stronger than you think. It's easy for kids to clone voices and deepfake these days. Anybody can ask any one of the free chatbots out there for a step-by-step guide to implement this; they'll even write the Python script for you, tell you what IDE to download, and explain how to run it from the terminal.
That’s exactly the pedantry that muddies the water and confuses people on this issue. Colloquially, it is a distinction without a difference. The internet as normal people know it does not work without encryption.
End-to-end encryption is a good thing, and so is this website providing information about how to use it.
But this particular article represents a particular pathology surrounding freedom. Freedom is supposed to be about doing what you want. It's not about making florid speeches about how free you supposedly are. If you want to use end-to-end encryption, just use it, and maybe offer advice to others on how to use it.
There are some politicians who have decided that only bad people use encryption. Going up to one of these politicians and trying to explain that you use encryption but you're actually a good person won't convince them that encryption's okay, it'll just convince them that you're a bad person. Politics is one of those things that attracts people who just want to find the shortest route to a decision about who are the good people and who are the bad people, and keeping secrets isn't something that those sorts of people like other people doing.
Unless you have evidence that the government is rounding up people just for using encryption, all this sort of advocacy does is to draw attention to you having something to hide, and therefore probably being some sort of wrong'un. If the government is rounding up people for using encryption, that's a specific threat you need to respond to, and starting a public campaign is not the right response.
>You should engage with the arguments the other side makes.
The arguments are "Protect the children.", "Catch terrorists.", "Catch criminals.".
Those arguments have been engaged with for decades. They are purely emotional arguments. Anyone who still pushes those arguments forth is most likely doing so with ulterior motives and cannot be reasonably "engaged" with.
Let's not ignore the full history here. That is a bad faith argument. It was a crime to use expensive encryption 30 years ago, but a lot of decisions were made to allow it. Today, every single one of those old caveats about child porn, drugs, money laundering, terrorism, (both domestic and international) and criminal acts in general all have stories where weaker encryption would have saved hundreds and hundreds of lives. We have to recognize this or we're just arguing past each other.
I'm not sure how this answers my question at all.
How many whistleblowers would have been killed without a secure way to blow the whistle? How many journalists and journalist sources would have been killed? Etc. These people aren't using the USPS for good reason.
Point being, you are only doing one side of your calculation and presenting it as a full argument. But it's just a bad argument unless you calculate both sides.
That's a different question. You asked how many people would have been harmed if weaker/no encryption was the standard. The USPS is a message system where federal employees are able to intercept suspicious content, and there is no built-in encryption for mail. Voting by mail is a great example of how a critical message can be sent without relying on encryption. Whistleblowers can still encrypt documents on a flash drive, and drop it into a mailbox. There is nothing stopping them from doing so.
I don't really want to hash this same thing out for the... At least hundredth time. You're not going to convince me, I'm not going to convince you, and we'll both just leave less happy if we keep going.
>Whistleblowers can still encrypt documents on a flash drive, and drop it into a mailbox. There is nothing stopping them from doing so.
The only thing I want to highlight for your consideration is that the USA is not the entire world. The USPS, even if it were perfect, does not exist in the overwhelming majority of the world. People talk to people across borders.
(Also, with some of the proposed laws, encrypting the USB would be illegal)
And no service offering encryption has existed since 1755? Because that is required for your argument. Otherwise you simply send unimportant stuff via USPS and sensitive/secret/important stuff via non-USPS.
My vote, my taxes, my REAL-ID driver license, passport, credit cards, phone SIM, checks, 401k statements, etc. have very recently been sent via USPS. Do you consider this unimportant stuff?
A bit of a nit-pick: 30 years ago was 1995, and it was not a crime to use PGP in the US in 1995. What Zimmermann was investigated for (he was never actually charged) was exporting the encryption technology, or allowing it to export itself by putting the code on an anonymous FTP server. The Bernstein case was similar in that it was the export of the machine-readable code the government objected to, not its domestic distribution. The right of researchers to publish text describing algorithms had earlier been recognized by the courts (which is why Martin Gardner could describe the RSA cryptosystem in Scientific American in 1977).
>> You should engage with the arguments the other side makes.
> The arguments are "Protect the children.", "Catch terrorists.", "Catch criminals.".
> Those arguments have been engaged with for decades. They are purely emotional arguments. Anyone who still pushes those arguments forth is most likely doing so with ulterior motives and cannot be reasonably "engaged" with.
Oh come on. Why do you think "purely emotional arguments" are illegitimate? Are you some galaxy brain, coldly observing humanity from some ivory tower constructed of pure software?
Nearly all positions people take are, at their core, "emotional." And the disagreements that result in "arguments" are often really about differing values and priorities. You might value your "freedom" more than anything and are willing to tolerate a lot of bad stuff to preserve strong encryption, some other guy might be so bothered by child sexual abuse that he wants to give it no encrypted corner to hide in. You're both being emotional.
Those are both reasoned arguments. The emotional argument would be "some guy is so bothered by sexual abuse he wants to ban lightbulbs because once he heard about a lightbulb in the context of an abuse". The "solution" is not really a solution, but the emotional person does not really care about solutions, he's too emotional to think straight.
>The GP was using "emotional" to dismiss the kind of arguments
I'm dismissing arguments that are designed to appeal to (and manipulate) the emotions of the person listening. Such as the three examples I gave, which are, in almost every case, used to win an argument without having to consider any possible nuance of the situation.
Often, it's a completely thought-stopping appeal, because everything is simply countered with "so you don't care about children". Or, in your case, subtly alluding to me being tolerant of CSAM (which was wildly inappropriate, albeit a great example of why I generally just don't talk to people who use those types of arguments).
Apparently that makes me galaxy-brained or whatever, though. ¯\_(ツ)_/¯.
> I'm dismissing arguments that are designed to appeal to (and manipulate) the emotions of the person listening.
My point is that's pretty much all arguments, except maybe some very obtuse ones no one really cares about.
> Or, in your case, subtlety alluding to me being tolerant of CSAM (which was wildly inappropriate, albeit a great example of why I generally just don't talk to people who use those types of arguments).
That's not what I was doing. I was giving an example to show it's a trade-off driven by priorities and values. But if you want to be super-logical about it, supporting strong privacy-preserving uses of encryption necessarily implies a certain level of tolerance for CSAM, unless you support other draconian measures that are incompatible with privacy. Privacy can be used for good and bad.
>My point is that's pretty much all arguments, except maybe some very obtuse ones no one really cares about.
There is a distinct difference between a person having emotions while arguing, and using an appeal to emotion as a rhetorical tactic. I do not agree that "pretty much all arguments" contain an appeal to emotion (again, as a purposeful fallacious rhetorical tactic), even though all arguments obviously will have people feeling some sort of emotion.
Even looking through this entire thread, most of the disagreements here do not contain appeals to emotions.
I'm sure that any book on logic and rhetoric from the last few centuries would explain it better than I can. The wiki page has some good explanations and examples as well.
For what it's worth, the anti-encryption/anti-privacy laws have caught terrorists in the UK. My company provides data storage for their dragnet and handles various requests, and I've seen first-hand four different instances where the UK gov watching everyone's internet activity led to terrorists being caught.
> anti-encryption/anti-privacy laws have caught terrorists
This is undoubtedly so; but much turns on trust in government. In the U.S., the president, himself a documented profligate liar, just invited an equally untrustworthy unelected person into the halls of government to vacuum up whatever data he pleased. Maybe trust in the UK government is higher.
Collecting data is often not the problem. The problem is how to evaluate it and use it to direct the use of finite law enforcement or counterintelligence resources.
But to your point, let's not forget congressional Republicans rushing a SCIF on Capitol Hill with their mobile devices out, in clear violation of policy (and common sense). I am relieved by the fact that Trump and Musk do not seem to understand what they can use sensitive information for (other than perhaps to sell or give it away to foreign governments and businesses).
I think my point is that good intelligence comes from stitching together numerous data points, and that traffic analysis is often as good as (or better than) content analysis. And maybe that the overwhelming majority of elected officials have no conception of how intelligence is collected and evaluated.
Low hanging fruit. The smart ones likely aren't being caught now.
Moreover, it's only a matter of time until the criminal fraternity all catch up and are on the same wavelength. That's when all but the dumbest know exactly what not to do or say on the net.
The Internet is still comparatively young, and like everyone else, those who have evil intent are still learning exactly how it works. I'd bet money that it won't be long before a 'bestseller tome' of definitive what-not-to-dos circulates amongst this mob.
The question is at what level will law enforcement's catch have to fall before it has to turn to other methods.
This number by itself means nothing as the other variables are unknown.
How many terrorists were not caught by these systems?
How many would have actually done these actions instead of just talking about it? How many could have been caught with just standard police work?
Without knowing these variables then there is no way to say if these systems are particularly good at catching terrorists.
I wouldn't go as far as saying it means nothing, but I agree that the story certainly isn't simple. I was just pointing out that "catch terrorists" isn't a purely emotional argument. Would the terrorists have been caught anyway? We'll never know, but there's no way you can say for certain that they would. Personally I don't think catching a few terrorists is worth giving up privacy, but other people disagree.
> Without knowing these variables then there is no way to say if these systems are particularly good at catching terrorists.
I don't think we can ever figure this out, since no one is willing to run an RCT when it comes to counterterrorism.
> Clearly the pressure on government to write these laws is coming from somewhere
Software surveillance vendors.
> Chat control: EU Ombudsman criticises revolving door between Europol and chat control tech lobbyist Thorn
> Breyer welcomes the outcome: “When a former Europol employee sells their internal knowledge and contacts for the purpose of lobbying personally known EU Commission staff, this is exactly what must be prevented. Since the revelation of ‘Chatcontrol-Gate,’ we know that the EU’s chat control proposal is ultimately a product of lobbying by an international surveillance-industrial complex. To ensure this never happens again, the surveillance lobbying swamp must be drained.”
The problem is LEOs (and associated industry) claiming that enforcement is impossible without the ability to obtain cleartext.
This is a lie: obtaining cleartext just makes enforcement vastly easier and more scalable. If crims have encrypted mobile phones, you can still point a microphone at them.
Honestly, I had always assumed LEO wanted access to decrypted message content so they could sell it to advertisers. I mean sure, you could catch a criminal or two, but with all that non-criminal data, just imagine how much off-the-books revenue you could accrue by selling it to the AdWords guys.
The other side being, for instance, the surveillance lobby that pushes for chat control laws in the EU? The "arguments the other side makes" are pretty clear at this point, and nothing to do with the "think about the kids" really, not sure engaging with them is the point.
> Something is a crime if society determines that it should be so. Nothing more.
According to The New Oxford Companion to Law, the term crime does not, in modern criminal law, have any simple and universally accepted definition.
Society also determined it was ok to use a firehose on black people, so I think the best we can say is that the term Crime has nothing to do with Morality, and people who conflate the two need to be looked at with suspicion.
> You should engage with the arguments the other side makes.
I don't. I think most arguments about crime require one-side to act in bad-faith. After all: The author doesn't actually mean that Encryption isn't illegal in some jurisdictions, they mean that it shouldn't be. You know this. I know this. And yet you really think someone needs your tautological definition of crime? I don't believe you.
The arguments are mostly that they dislike what can be accomplished via math. “The laws of mathematics are very commendable, but the only law that applies in Australia is the law of Australia” isn't exactly an 'argument' so much as an insistence.
The article does address the flaws in some of their arguments (encryption inconveniences law enforcement, think of the children) by pointing out that the average person and children are kept safe from criminal elements by encryption.
The arguments from the other side are of the "think of the children" and "tough on crime" variety. They are purely emotional and if you try to dispute them they just respond with "so you don't care about children?". It's like trying to argue with a religious person on matters of faith, you're just not very likely to convince them.
If we had trustworthy governments, or trustworthy police agencies, then maybe mandated backdoors wouldn't be all that bad. But if anything, recent events have clearly demonstrated that governments are not trustworthy; even one that is trustworthy today could become an evil regime tomorrow, and handing such an organization power over literally everything does not seem wise.
It doesn't seem like trustworthy governments is the issue. You can't have backdoors period because they'll be leaked / discovered and used by bad actors.
That too. But even if the government was perfect and trustworthy and free of leaks, that can still all go out the window as soon as a less trustworthy government is elected.
I have yet to see a case against someone that hinged on some data that was encrypted. Almost every tale from some cell needing to be cracked has ended in a fart because they got the information anyway using old-fashioned police investigation.
We went from Patriot Act to literally disappearing people without due process in only 23 years. Imagine if they could also decrypt your phone and plant evidence in advance.
Even if you trust someone with your life and you know this person is never going to betray you and will always have your best interests at heart, that doesn't mean that they automatically get a free pass to view and inspect everything I do every minute of every day until I die.
Unfortunately, that is what these governments want.
The problem is the average person doesn't care very much or understand it.
If you ask anyone if privacy matters they will of course say yes. If you ask them why they use software with telemetry or websites with Google Analytics they will simply shrug.
If you ask them if it's alright for the NSA to collect and analyze data from everyone they will say yes and they have nothing to hide.
People don't know what privacy is. They don't know what they are fighting for or where the fight is taking place.
If you take that and then add encryption to the mix... and you have politicians and agency plants talking about "saving the children from online pedos" by banning these "encryption apps and technology"....
You nailed the problem. Privacy is the tension between freedom and overwatch. Perfect privacy would yield zero justice, while zero privacy yields big brother/1984 overwatch. A healthy balance must exist for society to thrive.
"Secrecy of correspondence" is a longstanding legal principle in many countries (e.g. in Germany since unification in 1871; in the US there was a Supreme Court ruling in 1877).
The only way to guarantee secrecy is through encryption, preferably e2e.
It’s honestly annoying how often experts speak up about this, and still nothing changes. We’re stuck in the same cycle—fear gets in the way, and in the end, it’s our privacy and security that suffer. If anything, this should be a sign to invest in stronger encryption and better law enforcement tactics that don’t mess with the tools keeping us safe online.
Also... we're throwing around words like "crime" and "terror" and talking about shadowy quasi-governmental organizations encroaching on civil rights to privacy. I offer this commentary from the Eurythmics' score to Michael Radford's 1984 film "1984" to serve as background music for our discussions.
There's an abstract argument template that I've noticed floating around. It goes like this:
1. There's a thing T in the world, and that thing has negative outcomes X, Y, Z, and positive outcomes A, B, C.
2. Some people believe that Y and Z are so bad, that they want to partly compromise C to diminish them.
3. However that will never work! And they'll definitely also take B if we let them mess with C.
4. Besides, C is so important, that we should accept Y and Z to have it.
I've heard it many times before. Reading this post feels like watching a rerun of Friends.
Are you saying that this template is what the article is presenting?
If so I don't believe it applies, in particular because you have stated that only a partial compromise on C is needed to prevent Y and Z.
There is no "partial compromise" on encryption, so this argument is flawed. There is no way to have encryption that "only the good guys" can break. It is either secure, or it is not.
I've usually seen it phrased as "let's ban wheels (or cars) so bank robbers can't escape!".
But well, even that rebuttal is getting tiresome. It's the same people that keep pushing for banning air again and again. They control all the communication channels, so nobody can ever rebut them in a forum that matters, they control the governments, and they are still not popular enough to make that thing pass. Yet, they keep pushing for it.
I don't think we'll solve this by talking about this. We need to talk about systemic corruption instead. (But then, they control the communication channels...)
In my experience, this kind of problem goes unsolved simply because its difficulty is crystal clear, and there are usually no champions willing to push it through to the very end.
Because only the first part adds something to the discussion. It starts with the problem, then covers only one of the possible solutions (usually the low-hanging one), states why it's bad, and ends by denying the problem exists.
And also apply it equally to e-commerce and home banking.
Let's see how happy the voters are when they have to start walking to their bank again every week, can't order their latest Temu toxic-waste product anymore, and their GDP drops in half.
The same people who want to make encryption a crime (like Trump 45[0]) are using signal to discuss sensitive information without an audit trail. It's absolutely rules for thee.
Same with Chat Control. LEOs and EU MPs would be exempted from surveillance, because their lives and communications need to be private, since they are very important people. But yours? God, no!
And people wonder why democracy is out of style. With democrats such as these, you don't need tyrants.
Because SF-dwelling tech bros demand free speech but can perform the necessary mental gymnastics to overlook the right to manufacture and possess technology that has existed for over a century.
As a software engineer who specialized in cryptography in the 1990s and didn't work for the NSA (working for RSADSI, Bell Canada and Certicom) I feel I have an informed vantage point from which to offer notes.
a) This seems like a decent introduction to the subject of cryptographic regulation in the last 30 years. It's far from exhaustive, however. I do appreciate the collected references from diverse points in the last several decades.
b) I would have mentioned "Sink Clipper" and the ACLU "dotRights" campaigns. Neither are especially easy to find in the increasingly enshittified google cache, but Le Monde Diplomatique has this article, complete with a link to Sink Clipper poster (I think from the mind of Kurt Stammberger) that no collection of CypherPunk oriented ephemera from the era can be without: https://mondediplo.com/openpage/selling-your-secrets
c) Herb Lin presented a very nice paper back in the day comparing PROPOSED encryption regulation with ACTUAL encryption regulation. I think the thesis was that through the 90s, proposed regulation was increasingly draconian (Clipper, etc.) but actual regulation was liberalizing (effective deregulation of open-source tools.) I found Herb's page at Stanford and heartily recommend it, if for no other reason than its sheer volume of written material: https://herblin.stanford.edu/recent-publications/recent-publ...
e) Making the web "secure" or "private" is like putting lipstick on a pig. Modern web technology is designed to de-anonymize and collect identifying information to enable targeted ad delivery. Though I generally respect Moxie Marlinspike and have no great beef with Signal, there has been a concerted effort to exploit its device sharing protocol, and your carrier and national governments can easily extract traffic analysis info from people using it. Were I to add one sentence to this guide, it would be "While these tools are better than nothing, they are far from perfect."
f) The guide seems to conflate encryption with privacy. Encryption technology can enable privacy, but you're not going to get privacy from encryption technology unless you pair it with well reasoned policy (for organizations) and operational guidelines (for both organizations and individuals.)
The extreme example is to say "nothing stops a participant in an encrypted communication from sharing the un-encrypted plaintext after it's recovered." People earnestly trying to maintain message security probably know not to do that, but when talking about exchanging keys and figuring out which keys or organizations you should trust, it's easy for even the well-informed to make privacy-eroding decisions.
So... I think this article is a good jumping off point, covering material I would call "required, but not sufficient." I would just view it as the beginning of a deep-dive instead of the end.
* issued by the device manufacturer or application creator
* to law enforcement
* once a competent court of law has given approval
* that would allow a specific user's content to be decrypted prior to expiry
There are a million gradations of privacy from "completely open" to "e2e encrypted". Governments (good ones!) are rightly complaining that criminals are using encryption to commit particularly awful crimes. Politicians are (mistakenly) asking for a master key - but what I feel we should as a community support is some fine-grained legal process that would allow limited access to user information if justified by a warrant.
Competent jurisdictions allow this for physical search and seizure. It's not unreasonable to ask for the same thing to apply to digital data.
The first thing that's wrong is the principle - we should have a right to try to preserve our privacy. When even trying to hide is a crime, you live under tyranny.
The second thing that's wrong is the practice - despite the "going dark" panic spread by intelligence agencies, we have far, far less privacy than at any prior point in history, and spying on people, even people trying to hide, is much, much easier. So why the hell must we make it even easier still??
I don’t think this particular devil needs more advocacy.
Law enforcement agencies currently have more data about each of us and more sophisticated tools to investigate crimes than at any time in human history.
> Politicians are (mistakenly) asking for a master key - but what I feel we should as a community support is some fine-grained legal process that would allow limited access to user information if justified by a warrant.
The problem with all backdoors is the human element. Master keys will be leaked. A process to gain access to a temporary key is also subject to the human factor. We’ve already seen this happen with telecom processes that are only supposed to be available to law enforcement.
The other issue is one of a legitimately slippery slope. The asymmetric nature of the power dynamic between governments and their citizens makes it even more critical to avoid sliding down that slope.
And finally, in the environment you propose, criminals will just stop using services that are able to provide such services to the government. Criminality will continue while ordinary citizens lose more and more of their rights.
Well that's your view - but these demands aren't going to go away, and what I think is sensible is for us a technical community to consider reasonable alternatives. Every society is a compromise between anarchic freedom and authoritarian tyranny, and this is another discussion about how a (relatively) new set of technologies can fit into that compromise in a way that is acceptable and reasonable.
I acknowledge the problems you raise, but it does seem to me that we have a good set of systems in place in the form of PKI that has a remarkable amount of flexibility.
It's frankly a bit of an article of faith in our community that encryption == unalloyed good and I think we'd be right to think more critically about that position.
To me, this just means that we must remain vigilant. The slow creep towards authoritarianism isn’t going to go away either. The solution is not to look for reasonable ways for authoritarian rules to exist. Continuous harmful pressure must be met with continuous resistance.
> Every society is a compromise between anarchic freedom and authoritarian tyranny
Except not every society is such a compromise. Some are fully under authoritarian control, and serve as a warning for others who are tempted by authoritarian ideas.
> this is another discussion about how a (relatively) new set of technologies can fit into that compromise in a way that is acceptable and reasonable.
Breaking encryption need not inherently be part of that compromise. And until someone can explain how breaking encryption will actually stop the kind of bad actors used to justify such a direction (vs. driving them deeper underground, i.e. even if you outlaw encryption, it’s not as if law breakers will obey such a law), I see no merit in entertaining such a compromise. The crimes being committed are already illegal.
> It's frankly a bit of an article of faith in our community that encryption == unalloyed good
I don’t think most people in our community see it as inherently/perfectly good, but as extremely important and necessary. This is a critical distinction. As with everything, there are harms that come with the good, and such is the nature of all things. The question becomes: are the harms allowed worse than the good that is preserved? And would the new harms of disallowing the status quo be potentially worse than the harms supposedly prevented?
> I think we'd be right to think more critically about that position.
I agree that we need to think critically about this. But clearly we disagree about what one should conclude from such a critical analysis. I’d argue that taking the position that the government needs more power - especially at this moment in history - is the result of not thinking critically enough.
> Except not every society is such a compromise. Some are fully under authoritarian control, and serve as a warning for others who are tempted by authoritarian ideas.
Every society is on a continuum, and so represents some compromise between freedom for the citizen and power for the authorities. No society is perfectly free and no society is entirely authoritarian.
> Breaking encryption need not inherently be part of that compromise. And until someone can explain how breaking encryption will actually stop the kind of bad actors used to justify such a direction (vs. driving them deeper underground, i.e. even if you outlaw encryption, it’s not as if law breakers will obey such a law), I see no merit in entertaining such a compromise. The crimes being committed are already illegal.
Being able to legally access a private citizen's encrypted data in specific situations would help to (at least more rapidly) prosecute certain crimes more successfully. This is, I think, inarguably true. You can decide for yourself if that is worth a compromise. I'm somewhat on the fence.
> I don’t think most people in our community see it as inherently/perfectly good, but as extremely important and necessary. This is a critical distinction. As with everything, there are harms that come with the good, and such is the nature of all things. The question becomes: are the harms allowed worse than the good that is preserved? And would the new harms of disallowing the status quo be potentially worse than the harms supposedly prevented?
I think it's convenient and useful, but I hardly think it's necessary. Society managed to function just fine (although less conveniently) when strong encryption wasn't available for communications. Banking still happened, money still changed hands.
> I agree that we need to think critically about this. But clearly we disagree about what one should conclude from such a critical analysis. I’d argue that taking the position that the government needs more power - especially at this moment in history - is the result of not thinking critically enough.
It depends on how you define power. As society changes and new technologies emerge, maintaining existing government authority in new areas - and working out ways to ensure that authority is maintained - isn't really giving governments more power, but trying to ensure your society remains in the agreed location on the freedom/authority continuum.
If you see this as expanding powers, I can see how you would consider that a problem. But I think this is more about ensuring existing power is maintained correctly over a new area where crime is being committed.
Playing devil's prosecutor, I would say that technology has simultaneously made telecommunication a nearly constant part of life while also enabling mass surveillance on a global scale, and the process hasn't reached an endpoint. The result is an extremely slippery slope from "targeted lawful intercept" to "AI assisted sentiment analysis of every iMessage". Or in the future, everything seen by your AR glasses, every thought encoded by your neuralink chip...
Your limited lawful intercept example is reasonable to most, but as you yourself acknowledged, that's not what politicians are seeking. Therefore even if the community supports and enables "just that", politicians will eventually demand their wildcard cert. It will be a national emergency, after all.
Prior to expiry would suggest the encryption is broken from the start.
Although I do disagree on the reasonable/unreasonable angle, because I don't tend to analogize the contents of your phone to the contents of your safe, but rather to the contents of your mind.
Well I get that a significant part of our lives is wrapped up in our phones nowadays, but I still try to preserve a safe haven between my ears...mostly...
Yes, what I am arguing for is requiring OEMs to implement this mechanism.
Frankly, if the NSA wanted to have Apple build a custom iOS version for a criminal so they could sniff his network traffic and flash content from the comfort of Maryland I don't believe that would be impossible today.
If an OEM could decrypt a users data, a government typically won’t bother to do it themselves. They’ll just use legal mechanisms to require the OEM to do the work for them.
Again, as a thought experiment, what legal protections can we put in place - an encryption ombudsman or independent authority - that would allow an arms length, controlled and expiring mechanism that allows limited access to a user's data? What would we as a society be happy to accept? I don't think the demand is an unreasonable one, but I'm trying to figure out what a reasonable collection of mechanisms looks like.
This requires the device manufacturer to have the capability to decrypt the data (to be able to do so when all this process is properly observed)
If they have the capability to decrypt the data, a court can compel them to do so, disregarding the process you suggest. A cyberattack could achieve it without a court order.
Because it is not realistic to expect a government to always be "good". Courts are just going to rubber-stamp warrants, like they have done with present-day "lawful interception" warrants. And the keys are inevitably going to leak if they are used routinely to investigate common crime.
That expiration is impossible to enforce. If you have the data and the cert, you can use it whenever you'd like, and the only thing preventing you from doing so is some piece of software voluntarily choosing to comply.
What that means is, there exists a master key in your scheme.
Users can ignore the expiration date on a TLS cert. Cryptography doesn't enforce time constraints, business logic does.
somewhere a piece of code would have to say "here I've got this key, which can decrypt this text, but I'm not going to" and that decision is not protected by math.
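To make that concrete, here's a minimal Python sketch (toy XOR "cipher", hypothetical expiry timestamp, all names made up) showing that the expiry check is an ordinary `if` statement, not anything the math enforces:

```python
# Toy sketch: WARRANT_EXPIRY and the XOR "cipher" are stand-ins, not a
# real design. The point is that only the `if` enforces the time limit.
WARRANT_EXPIRY = 1735689600  # hypothetical warrant expiry (unix time)

def xor(key: bytes, data: bytes) -> bytes:
    # toy cipher standing in for real decryption
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def lawful_decrypt(key: bytes, ciphertext: bytes, now: int) -> bytes:
    if now > WARRANT_EXPIRY:
        raise PermissionError("warrant expired")  # policy, not math
    return xor(key, ciphertext)

def rogue_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    # same key, same math, policy check deleted: works forever
    return xor(key, ciphertext)
```

Anyone holding the key can run `rogue_decrypt` whether or not the warrant has expired; the check only binds software that chooses to comply.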
I'm not sure I follow. Obviously the application itself needs to support the business logic described, in the same way as your web browser needs to notice that a certificate has expired and tell you there's a problem with a website you're visiting. What I'm exploring is why requiring certain applications to support the same sort of thing to decrypt user data in certain circumstances to support law enforcement is a problem.
The closest you can get to what you want is a trusted third party who would help derive the final key, so that the key could not be revealed to law enforcement without the cooperation of that third party, who would verify policies like time limits. It may also be possible for the "trusted third party" to be a piece of tamper-proof hardware. I think people are generally suspicious of these schemes because they rely on "trust".
Also, I think Apple has a scheme similar to this for protecting the passcode from being brute-forced when recovering from an iCloud backup. However, if that scheme breaks, I believe it doesn't reveal the encryption key; it just allows the passcode that protects the encryption key to be brute-forced, which may or may not result in the encryption key being revealed.
The problem is that nefarious actors aren't physically barred from the data. If China, Big Balls, Zuckerberg, or anyone else want to access that data then they can just remove that check.
More importantly, the thing you're asking for (law enforcement retroactively snooping without there existing a master key) is always impossible.
For other forms of snooping (like a warrant to tap communications for a single device for a period of time), you have related issues. Suppose you magically make such a thing flawless -- the client can't detect intrusion, a single key is actually time-bound, etc. There still exists a group of people with the power to hand out such keys, and that power, however it's implemented, is still a master key to all future communications over that protocol.
You can partially mitigate the risk in various ways, but you can't eliminate it. Every proposal for weakening cryptography in that way has had glaring flaws, and many known attempts at actually weakening it have later been cracked by nefarious actors. Spying, but only for the "good guys," should be met with extreme skepticism as far as cryptographic protocols are concerned.
For all of these schemes, what happens when the people holding keys and power are physically forced out (DOGE et al)? Even if we assume the thing is implemented flawlessly, the people involved never leak anything, the master keys stay secret, ..., you still have the human problem of transitions in power. Do you want the current US administration, one currently arguing that it can "deport" actual citizens to torture prisons with no recourse or court case, to know that six years ago your daughter confided to her best friend that she got an abortion once? That she doesn't believe Israel should be committing genocide? Or, suppose you approve of the current administration, what about the next one that takes the reins with this new set of powers? It's bad enough without decades of chat history to let 70%-accurate AI comb through and make deportation decisions.
Yes, but if a court decides you have committed a crime, and law enforcement show sufficient cause to obtain a warrant, they can seize your secret and - if it's relevant - use it to show your state of mind when the crime was committed.
But in the same way that as a society we allow physical privacy (and freedom!) to be removed under certain circumstances, we should consider allowing digital privacy to be removed in the same way. 1984 imagines a world where the authorities can enter your physical space at any time because they feel like it. But I don't lie awake worrying about that because I live in a society where I feel the social contract is largely upheld by the authorities.
I don't accept that. We have "master keys" for some forms of encryption right now in the form of root certificates; knowing that root cert authorities could issue certificates that might allow people to sniff my network traffic doesn't keep me awake at night.
To reduce any risks, almost everything PKI-related is conducted in public, is auditable by anyone, and is a cooperation between dozens of distinct entities located globally.
This is not analogous to a single government having non-transparent, non-auditable access to decrypt communications of its own citizens.
>Then setup the system to be more analogous. Make the publication of key issuance under this mechanism public after a period of time.
That (somewhat, barely) addresses one of ~dozen issues with the proposal.
>Again, I see us falling back into an "all or nothing" view of privacy
Not to be too pedantic, but I think the distinction between privacy and encryption is incredibly important: almost everyone agrees that privacy is a gradient. The disagreement is whether or not encryption can be a gradient. Most people do not think it can reasonably be without undermining ~everything relying on it.
I get the hostility towards it - as I've said elsewhere, it's practically an article of faith in our community that strong encryption == unalloyed good. And clearly it needs a lot of thinking to address potential abuse. But we've done it for other things.
> Not to be too pedantic, but I think the distinction between privacy and encryption is incredibly important: almost everyone agrees that privacy is a gradient. The disagreement is whether or not encryption can be a gradient. Most people do not think it can reasonably be without undermining ~everything relying on it.
That is a fair criticism. I would answer that by saying that encryption is just a technology, and you can employ it in very flexible ways (including e.g. n-of-m style keys) which if thought through well and legislated carefully could give the authorities more reasonable access to data when it is legally warranted.
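For what it's worth, n-of-m splitting is standard cryptography (Shamir secret sharing). A toy Python sketch, not production crypto, where the "secret" stands in for a decryption key and the shares would go to hypothetical trustees (say, the vendor, a court, and an ombudsman):

```python
import random

# Toy Shamir m-of-n secret sharing over a prime field. Any m shares
# reconstruct the secret; fewer than m reveal nothing about it.
PRIME = 2**127 - 1  # a Mersenne prime, big enough for a demo

def split(secret: int, n: int, m: int):
    """Return n shares of `secret`; any m of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(m - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total
```

The hard part isn't the math, which has been solved since 1979; it's agreeing on who holds the shares and whether they can be compelled to combine them.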
That sounds like a golden-key approach, and the problem is that your communication is no longer protected by math; it's only protected by a stranger's willingness to be tortured by the government rather than give you up.
The back-and-forth discussion on cryptography happens because there just isn't much middle ground. Either someone else can read your messages, or nobody else can. If one person can read them, the government will push on them until they crack.
No, I don't accept that this is the case any more than the root certificate system is a golden key. I'm quite sure that Apple can issue me a certificate that allows me to build a custom version of iOS that can be flashed onto my phone; why doesn't the same thing apply to other things?
Well, if decryption is so justified, then brute-force breaking that takes significant resources, so that it's hard to misuse unnoticed, would be a good approach. When you can only break into 100 phones a year, there's no slippery slope, and no fascist government could wildly misuse it for its own gain, because it's not physically viable.
Ok! That is a sensible rejoinder. Make a proof-of-work system that limits the authority from making more than n requests in a space of time. A good brake on abuse.
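A hashcash-style proof of work is the usual way to price a request in compute. A minimal Python sketch (the request strings and difficulty are hypothetical; a real scheme would also need the cost bound to something unforgeable):

```python
import hashlib
from itertools import count

# Toy hashcash: a request is only valid with a nonce whose SHA-256 hash
# falls below a difficulty target, so every request costs real compute.
DIFFICULTY_BITS = 16  # toy value; tune to set the cost per request

def solve(request: bytes) -> int:
    """Brute-force a nonce meeting the difficulty target (~2^16 hashes)."""
    target = 1 << (256 - DIFFICULTY_BITS)
    for nonce in count():
        digest = hashlib.sha256(request + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(request: bytes, nonce: int) -> bool:
    """Cheap check that the submitted nonce really was paid for."""
    digest = hashlib.sha256(request + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - DIFFICULTY_BITS))
```

The obvious catch, relevant to the thread above, is that an authority with a data center can buy its way past any fixed difficulty.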
> Governments (good ones!) are rightly complaining that criminals are using encryption to commit particularly awful crimes.
For starters, I don't know a lot of good governments. So you'll have to define how you differentiate between a good one and a bad one.
> Governments (good ones!) are rightly complaining that criminals are using encryption to commit particularly awful crimes.
Secondly, criminals use public transport and roads built with taxpayer money to commit crime. Some even say that they breathe the same air as us honest citizens.
They also live in homes with 4 walls that you can't see through either.
I am being facetious but you can see where I am going with this.
If you think that the governments will stop at spying on criminals once this backdoor is in place, then I have a bridge to sell you.
Do you want your kids to grow up in a world where everything they do online will be analyzed, categorized and reviewed by some random government employee somewhere?
What if this government turns bad in the future as it has happened countless times in the past? What do you do then?
> I feel we should as a community support is some fine-grained legal process that would allow limited access to user information if justified by a warrant.
The problem with this line of thinking is that it doesn't hold up in the real world. Once you grant the government, or any entity, access to something like your browser history, what's to stop them from asking for more next time?
It's not a big deal right, they can say, well you gave us access to A, now we want access to B. Then in 3 years they will come back demanding access to C, D and E until your entire privacy has been taken away from you.
And every time, they will use the same excuses, fighting crime, fighting drugs, child grooming and terrorism.
> Competent jurisdictions allow this for physical search and seizure.
That is not even remotely comparable.
In those cases, you need a judge or someone to approve the seizure. With a backdoor that can be opened at any time, you should consider that nothing will be private because there is no one who is going to be monitoring it 24/7 to make sure that there are no abuses.
> In those cases, you need a judge or someone to approve the seizure. With a backdoor that can be opened at any time, you should consider that nothing will be private because there is no one who is going to be monitoring it 24/7 to make sure that there are no abuses.
I'm not sure you've read what I wrote correctly. My hypothetical system would not allow the backdoor to be opened at any time, but it would require a certificate to be issued (derived from the manufacturer / application creator's root) that gives limited, expiring access on the production of court-authorised warrant, in exactly the same way a judge gives the police permission to enter your physical property.
Is Indian government a good one, or Hungary's, or Turkish, German, or British, or the US? In the last case (well, in all cases), does "goodness" of a government depend on the current incumbent? What if a previously "good" government turns into an atrocious one?
See also: the detailed Dutch census, which was mostly harmless, until it fell into hands of the Nazis in 1940 and helped them to identify and exterminate almost all Jews in the country.
Every system of authority carries with it the risk of abuse; but we still accept legitimate authorities carrying out breaches of personal privacy for the sake of law enforcement - the warrant system being the obvious one. That's part of the compromise we make in society.
Good governments ensure that a breach of personal privacy has to travel through a legitimate process with an independent judiciary to limit the risk.
Do you think that this can be done without introducing massive security weaknesses into systems that cannot have them?
Also, there is a question if you believe the authorities that without decrypting data, they can't investigate crimes.
Imagine an analogous assertion that without torturing suspects, law enforcement is stymied. Someone might assert that, but we still say no, for all sorts of fundamental reasons. Same with American Miranda rights and others.
Myself, I don't believe in that assertion at all. Most crimes leave a massive real world trace that cannot be encrypted. The ones that don't, maybe should not be crimes in the first place.
> Do you think that this can be done without introducing massive security weaknesses into systems that cannot have them?
Yes, I do - or rather, that is the point of the discussion. We currently allow central authorities to indicate our permission to do or be something in the root certificate system. Why can't something similar be designed to allow controlled decryption?
> Also, there is a question if you believe the authorities that without decrypting data, they can't investigate crimes.
Clearly there are circumstances in which being able to decrypt the data of a criminal would assist in prosecuting crime. See EncroChat for an example of how this has worked.
> Imagine an analogous assertion that without torturing suspects, law enforcement is stymied. Someone might assert that, but we still say no, for all sorts of fundamental reasons. Same with American Miranda rights and others.
Yes. Clearly there are reasonable limits that need to be applied before we can allow controlled decryption of data. I am not arguing for issuance of a master key. See my original post.
> Myself, I don't believe in that assertion at all. Most crimes leave a massive real world trace that cannot be encrypted. The ones that don't, maybe should not be crimes in the first place.
Some do, and some don't. Things like e.g. cryptocurrency heists have profound effects, and are propping up North Korea. Those are definitely crimes...
> It's not a crime to lock your home's door for protection, why would it be a crime to lock your digital door?
A locked home's door is still trivially opened. You can pick the lock or even apply simple brute force; neither is all that difficult, and open it will. Similarly, I don't suppose anyone would be concerned about you using rot13 encryption. If a home could be sealed to the degree strong encryption provides, sealing it absolutely would be a crime, for better or worse.
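(For anyone unfamiliar, rot13 really is that weak: it's a fixed letter rotation that undoes itself, so "decrypting" needs no key at all.)

```python
import codecs

# rot13 rotates each letter by 13 places; applying it twice restores the
# original text, so it provides no secrecy -- unlike strong encryption.
msg = "attack at dawn"
scrambled = codecs.encode(msg, "rot13")
recovered = codecs.decode(scrambled, "rot13")
```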
Which high security vault can the government not gain access to under any circumstances? I expect you'll find decent explosives or a bulldozer will get them in just fine.
So will a hardware backdoor planted by your maid, or a telescopic lens pointed at your screen, or laser microphone on your window, get them into your e2e encrypted chats.
If you intended to reply to a different thread and accidentally ended up here instead, there is truth to what you say, but it has nothing to do with this one.
As it pertains to this thread, where the sole key holder is dead and took the knowledge with him, how do you anticipate gaining access to the data using live attacks? There are plenty of reasons why the government wants access to data even where prosecution isn't necessary.
I said appeal to emotion. There is no logical foundation for the question. It can only be asked in the context of how one arbitrarily feels about the subject, serving no purpose, and the derailing was happening for no good reason. How one feels about the subject has no impact on what we are talking about, and trying to draw a link between them is completely nonsensical.
The issue isn't "gain access to" - it's "gain access to without destroying the contents."
Explosives and bulldozers are likely to harm whatever was motivating the entry in the first place. The vault system can be engineered to ensure this conclusion, as well.
The question specifically asked which one(s) you are talking about, not about your dreams. Which high-security vault is the government not able to gain access to?
And, sure, if enough perfectly engineered vaults were impeding the government from carrying out the activities it wants to carry out, there would be calls to make building/using such a vault illegal too. In the real world, such vaults, if they exist at all, don't meaningfully get in the way. Thus there is no reason to think about it. We don't create laws on what theoretically might be a problem in some magical imagined world. We only create laws after something is identified as an actual problem.
> The question specifically asked which one you are talking about, not about your dreams. Which one is like that?
After your five ninja edits, it's been hard to keep up:
Glass relocker mechanisms have existed (in reality) on safe doors for decades and will often result in the destruction of contents if triggered and opening is still required.
Governments are normally seeking evidence: a stack of cash or a quantity of bulk substances are substantially harder to rig to destroy (obviating evidence gathering) than documents or data.
> After your five ninja edits, it's been hard to keep up
No need to reply within the first second. Take your time.
In fact, consider taking a lot more time as you still haven't named the specific vault, or set of vaults, that is causing such a big problem for the government. If we don't know what vault it is, even if your description is vague, how would anyone come to think of it as a problem? Laws are not created by some all-knowing deity. It is just people.
That such a vault might be theoretically possible to build is irrelevant.
Vaults and safes are boutique products. Glass relockers have been sold for decades - can you not extrapolate that heat and impact might destroy something inside of a highly thermally-conductive container?
HSMs and similar tech have had tamper detection systems for decades, with internal battery backups. These aren't illegal yet. My server cases from 20 years ago had tamper switches for exactly this purpose. How hard is this stuff to engineer?
Let me ask again: Which vault(s) are currently, or at least in recent enough memory for anyone to recall, causing great strife for the government? Even a rough location would be sufficient. We can offer that in the case of encryption. There are countless news articles about police not being able to decrypt data they deem important.
Without that, it doesn't matter. Laws are not created based on imagined situations that you can dream up. They only are created after something has become a problem. You can use a perfectly impenetrable vault all day long and as long as the government doesn't want in, it is never going to care.
Of course, the greater subject is really about houses, not vaults. The government has good reason to want to get into your house. For example, you might perish in it, and it needs to get in to deal with your mess. This is a relatively frequent task placed upon government to carry out. If you've made your house impenetrable, government isn't going to remain amused for long. If the government starts encountering that problem often, absolutely it would become illegal.
It is not illegal today because it has never posed a real problem.
That analogy breaks because a home's locked door has the constraint that it can effectively only be attacked by someone physically coming to that door. On the internet, multiple criminals can attack all doors at all times.
Scalability is the crux of why encryption must not be infringed.
The claim that LEOs need to break encryption is based on laziness: they want to easily obtain access to communication, and at scale. They've always been able to obtain communication the hard way, and one-at-a-time - encryption doesn't change that.
> A warehouse with shutters and bulky padlocks, a night security guard and camera system is a crime?
No, why would it be? The security guard isn't going to wage war with the police/military when they want in. The guard will politely comply with any legitimate (and probably even illegitimate) request for access.
> A bank vault is a crime? Safety deposit boxes?
Banks are heavily regulated by the government. They especially aren't going to impede access if push comes to shove.
Laws aren't created on purely theoretical grounds. They are created only when a problem that needs to be solved is identified. The government has never had much trouble accessing physical spaces when they feel a need to. They have had trouble accessing encrypted data.
> is thinking (without transcribing for the government) illegal.
Thinking without a willingness to share what you thought with the government when it feels it needs to know (e.g. in court) is illegal. Full transcription is not always legally required, but it is in some specific contexts where there have been problems getting proper disclosure. Again, laws are created to deal with actual problems, not imagined problems.
I'll note that encryption isn't illegal today. While there are some outlier cases where it has been a challenge to government, it hasn't become a big enough problem to do anything about yet. But if it reaches the point where it is deemed sufficiently problematic, it will become illegal in some kind of fashion. What that looks like is obviously to be seen. It won't necessarily be a blanket ban on all encryption, or even a ban on encryption at all, but most people are not capable of imagining anything else, so here we are.
I'm not surprised; the UK has become a literal African/Middle-Eastern hell hole. They've kicked out all the working immigrants and replaced them with ultra-religious freaks.
And of course, the cherry on top is that the UK is a country where every form of self-defense is the most serious crime: when attacked you must call the police, then lie on the ground and die.
It's pretty universally known that where you live could now be considered a microcosm relative to the rest of the country. Although, I wouldn't necessarily have used "hell hole" to describe the rest.
I do not like these framings of "not a" because it always sounds so suspicious like "we are not a cult".
It puts the idea into the world that it could be a crime and maybe that it is the status quo.
Much better IMHO is something like "Encryption is a fundamental right.", "Encryption protects everyone.", "Without encryption there is no democracy." and so on.
Maybe "Don’t let them take your right to privacy."
It's also, unfortunately, not literally/universally true. There are plenty of jurisdictions and contexts in which it is a crime.
A friend of mine had to swear on a holy book to not use VPN upon returning to their country of origin.
This would make me nervous, but also optimistic that it's difficult to detect? Otherwise why use hell as a deterrent?
Which were you thinking of?
I can imagine Iran has some effort to discourage use of VPNs, though of course everyone does.
I thought China simply made it easy to stay within the Great Firewall, and moderately difficult to get out.
There's a decent overview here: https://www.gp-digital.org/world-map-of-encryption/
I don’t understand this. If you live in the US and use a service like ProtonMail, has a crime been committed? Are there any examples here in the US or anywhere else of arrests/prosecutions being made over encryption? I’ve never heard of any??
He's probably talking about authoritarian regimes like China or whatever.
[dead]
I don't live in the US
There are places that are not the United States.
Also, I've heard it said that people have a tendency to subconsciously flush "not" and remember that sort of statement as "encryption is a crime". It is slightly better to put things positively (eg, "Encryption is the reasonable default").
It makes sense in this context, as it operates on the idea that it could be a crime: "Contrary to what some policymakers seem to believe, whether naively or maliciously, encryption is not a crime."
The "is a right" framing hasn't worked in years, in a lot of areas. I rather agree with the more specific and up-to-date "is not a crime".
Claiming a right is not the same as exercising the right, though typically the claim precedes the exercise. Also, are you suggesting you don't have the right to use encrypted communications? (assuming you're not in France, where the "right" to encrypted communications has never been explicitly recognized.)
I generally agree. My first thought was that if I encrypt your data without your permission, that would be a crime (eg ransomware).
I think you also have to delete the original information before it becomes a crime. Though monkeying around on someone else's machine(s) without permission is criminal behaviour in most jurisdictions I've looked at.
"Speech is not a crime"
Not all speech is non-criminal, at least in the US. Inciting speech has always been subject to prior-restraint and fraudulent commercial speech has always subjected the speaker to legal peril.
I wish Americans still believed in American freedoms
Encryption is free association and free speech. Talking to someone about what I like without eavesdroppers
Transitioning gender is also free speech, freedom of expression. Presenting how I like and not how some wannabe king wants me to
"I wish Americans still believed in American freedoms"
Yeah, as someone who's viewed America from the outside for decades tragically it's no longer the country I once knew.
I think this is a perception formed by media biases. Pretty much any right or freedom evaluated on an individual basis will show that rights and freedoms have expanded (at least up to a few months ago). Many of the negative things being done today have been done in one form or another for a generation or two. I'm not saying that they're right or shouldn't change, just that the perspective of eroding freedoms or right is generally not true outside of business regulations.
No doubt 'perception formed by media biases' has played a significant part and it's been exacerbated by modern communications and social media etc. but I'd contend there's more to it than just that.
What I didn't mention was that I've been to the US many times and I've relatives there, and I've even worked there and these factors have also influenced my perception.
Let me put it this way, if the Greatest Generation, aka the G.I. Generation were to come back today and saw what has happened they'd not only be dismayed but horrified. Right, much of that reaction is to be expected with intergenerational change, etc. but again I'd suggest it's more than that.
It's not possible for me to even begin to justify what I've said, as even a précis would take me many pages. Instead, I'd refer you to journalist Tom Brokaw's 1998 book The Greatest Generation, wherein he describes the values and beliefs of the people of this generation as well as the ethos of the era in which they lived. Far be it from me to tell American society what it ought to be doing, but I'm of the opinion it wouldn't be a bad idea if all Americans read this book—after all, it's actually Brokaw who's making the suggestion that his countrymen read the book, or he wouldn't have written it.
In short, Brokaw wrote this book because he sensed the same change in US society as I had, and no doubt much more acutely so. I'll now extrapolate: it's now over a quarter century since he wrote it, and I'd contend the contrast to which he referred is now even more extreme.
https://en.m.wikipedia.org/wiki/The_Greatest_Generation_(boo...
BTW, just don't take my word, I'd suggest you search out some of the book's reviews.
Incidentally, when I was working in NY some decades ago I shared my office with a GI of that generation and he became a great friend. I had many discussions with him about his past experiences. I consider it a great privilege to have known him (his name would be familiar to some of you).
The big issue right now is that they can't seem to agree on what those freedoms even mean.
It probably was never the country you once knew. The view of "american freedoms" is radically different depending on the viewer's position in the socioeconomic hierarchy.
Perhaps not. See my reply to giantg2.
> I wish Americans still believed in American freedoms
I wish people understood the American system at a philosophical level. What you call "American freedoms" are largely based off of negative rights, i.e. John Locke. Our bill of rights use specific language like "Congress shall make no law", "shall not be infringed", "shall not be violated". It's inherently freedom from state action.
Over the past 100 years a different interpretation of rights has emerged, so called positive rights as exemplified in FDRs second bill of rights; e.g. "the right to a good education" or "the right to earn enough to provide adequate food and clothing and recreation". This requires state action to facilitate freedoms for its citizens.
Unfortunately these systems are incompatible. I think a lot of the friction we are seeing in modern times can partially be traced to this contradiction.
"Unfortunately these systems are incompatible. I think a lot of the friction we are seeing in modern times can partially be traced to this contradiction."
I'm pretty certain you're correct, but I won't attempt to justify it in detail here, as we'd have to bring out the political philosophy texts en masse.
In the light of the English Civil War, many thought about politics and freedoms, Locke being one; his near-contemporary Thomas Hobbes took a different position—the Leviathan. Debate over rights, freedoms and social contract theory was still raging nearly a century later, with Rousseau whingeing about man being born free but everywhere being in chains—the opening line of The Social Contract. And there's still no universally agreed consensus.
Over the centuries political philosophy has covered almost every conceivable interpretation/position on the rights and powers of the State versus individual freedoms, so it's not for the want of options/choices. Dichotomies still remain because the citizenry is composed of people with a wide range of political beliefs, many of which are incompatible (this has always been the situation).
We shouldn't expect a consensus.
I'm not sure the US population ever really believed in fundamental freedoms.
They had an apartheid up to 60 years ago. There are living people from that time, and you can't believe in any human right and have an apartheid at the same time.
People believe in logical inconsistencies all of the time, it’s practically the default. Also there is no such thing as perfect freedom, it’s best thought of as an optimization problem with many dimensions.
As an example, the civil rights act necessarily curtails the freedom of association.
"They had an apartheid up to 60 years ago."
For many of us outside the US there's a dichotomy here. The North won the bitterly contested Civil War and freed the slaves but never really afforded them true freedoms. Why?
The perception from the outside is that conscience over slavery per se drove the North to war and not concern for the fact that slaves were actually people who were suffering enslavement and or unfairly treated.
Edit: Given the Civil War why was the Civil Rights Movement 100 years later necessary?
> Transitioning gender is also free speech, freedom of expression.
Is a legal requirement for others to affirm this expression also "free speech?"
Has there been a single instance where someone faced legal repercussions for not affirming someone's gender?
Legislation is currently in the works (and likely to pass): https://leg.colorado.gov/bills/hb25-1312
"Lose custody of your child" is very much a "legal repercussion."
You are suggesting that family courts in Colorado should be barred from hearing info from psychologists about the impact of dead-naming?
> You are suggesting that family courts in Colorado should be barred from hearing info from psychologists about the impact of dead-naming?
Classic strawman argument. Where was anything like this suggested? Are they barred today under existing legislation?
The eventuality of this line of reasoning is that: "special interests who control the DSM (or whatever standards body governs these soft sciences) can influence and determine the outcome of custody battles."
DSM-4 defined "gender-identity disorder" as a thing, that's now been de-pathologized to "gender dysphoria."
Under your framework, a body of unelected, politically and financially-motivated "experts" can now determine the imposition of legal consequences on a whim.
There was a question mark at the end of the sentence, indicating a question.
"Are you suggesting that family courts in Colorado should be barred from hearing info from psychologists about the impact of dead-naming?"
It is a simple question. I think it tells more about your viewpoint than you may think that you consider discussions of trans issues "a strawman."
I also appreciate how you have decided that you know my thoughts on a complex subject simply by me asking you to provide more detail as to what you were saying.
It's entirely possible I'm not attacking you. It's possible I don't understand what you're saying.
This is too many words to convince someone who already doesn’t believe this.
Put more simply: the modern internet doesn’t work without encryption, it is a fundamental part of the technology. Without it, anyone could log into any of your accounts, take your money, messages, photos, anything.
>Put more simply: the modern internet doesn’t work without encryption, it is a fundamental part of the technology. Without it, anyone could log into any of your accounts, take your money, messages, photos, anything.
I'm pretty pro encryption, but even this is pretty dishonest. Phones (ie. PSTN, not iPhones) aren't "encrypted" by any means, but there's plenty of sensitive information sent over it. Lawyers fax each other important documents, and doctors fax each other medical recorcds. There was (is?) even telephone banking where you could do basic transactions over the phone. Even today, some banks/brokerages require you to phone in to do certain high risk operations (eg. high value transfers or account resets). All of this happens without encryption. While that's less security that I'd like, it's safe to say that "anyone could log into any of your accounts, take your money, messages, photos, anything" isn't true either.
I’m not saying every layer of the onion is individually encrypted. But there are plenty of layers that are.
There is plenty of encryption used when you send any sort of message from an iPhone, even SMS. You can’t even turn the dang thing on and unlock it without encryption. Then when you send it, it’ll be encrypted by the radio before transmission. Then in transit it may or may not be encrypted at various points.
And POTS is not the internet.
My overall point is that encryption is used all of the time when people use the internet for routine tasks that they expect to work, and would not work in a modern reasonable way without it.
People use these technical implementations details to muddy the water of this conversation and demonize encryption, when the reality is that everyone uses it literally all the time for almost everything.
>There is plenty of encryption used when you send any sort of message from an iPhone, even SMS. You can’t even turn the dang thing on and unlock it without encryption. Then when you send it, it’ll be encrypted by the radio before transmission. Then in transit it may or may not be encrypted at various points.
If your argument for encryption is "we need encryption because if it's banned overnight all our phones will turn into bricks!", then yeah sure I guess it's true. But even the diehard encryption opponents aren't arguing for this. My point is that you can very much have no encryption, but not "anyone could log into any of your accounts, take your money ...".
Colloquially, there is a perception among some that encryption is a thing that only the military, criminals, etc use.
Many people are unaware that they use it in everyday life.
If you listen to discussions on this topic outside of technical forums, this perception is not uncommon. It’s important to be clear to laypeople about the ubiquity of encryption, because they are the majority of voters.
Another aspect is traditionally the administrative burden for state actors to receive permission to eavesdrop on POTS technology is relatively high. Or at least it was before the Patriot Act. I would argue it is still higher than eavesdropping on modern digital communications (IPCMS, Email, web browsing, etc.)
Allowance for using faxes to send protected health information (PHI) as defined under HIPAA was essentially grandfathered in for practical reasons, not because it is at all a secure enough communications system for sensitive data. If faxing medical records had been banned, the healthcare system would have come to a halt, which would have been worse than the privacy risk. But if fax were invented as something new today, it would never be allowed for PHI.
It's only recently that more secure alternatives to faxing have become practical, like DirectTrust Secure Direct Messaging.
1. How often are people saying their bank login on their phone calls?
2. Is there a way for phone call man in the middlers to get that info without wasting a ton of time listening to calls? With internet MITM it is very easy to set up a program that scrapes unencrypted login info.
>1. How often are people saying their bank login on their phone calls?
Have you ever called into a bank or brokerage? Most ask "security questions", often ones that you can't even choose, like your address or how many accounts you have with them. It's arguably far worse than speaking your password into the phone.
>2. Is there a way for phone call man in the middlers to get that info without wasting a ton of time listening to calls?
Automated speech recognition has been around for decades. Even before that signals intelligence agencies have shown that widespread wiretapping/eavesdropping is possible and effective.
It's about threat levels, I guess. I was man-in-the-middling passwords in high school without knowing much of anything about technology. Setting up speech recognition alone is a task that most people are incapable of doing. If you're worried about the type of people who can set that up, you probably shouldn't be giving that info out on a phone call.
“Setting up speech recognition alone is a task that most people are incapable of doing”
If you were MITMing in HS, your modern-day equivalent is way stronger than you think. It's easy for kids to clone voices and deepfake these days. Anybody can ask any one of the free chatbots out there for a step-by-step guide to implement this; they will even write the Python script for you, tell you what IDE to download, and how to run it out of the terminal.
>Put more simply: the modern internet doesn’t work without encryption
Being pedantic, that should read "the modern usage of the internet".
The internet does work OK without encryption, as it has done since long ago.
That’s exactly the pedantry that muddies the water and confuses people on this issue. Colloquially, it is a distinction without a difference. The internet as normal people know it does not work without encryption.
I do agree, it depends on the context, eg talking to my family vs this forum
This site is not full of "normal people", and it shouldn't confuse people or muddy the water if discussed here.
I agree, but I am saying this here because the OP is written for a nontechnical audience.
End-to-end encryption is a good thing, and so is this website providing information about how to use it.
But this particular article represents a particular pathology surrounding freedom. Freedom is supposed to be about doing what you want. It's not about making florid speeches about how free you supposedly are. If you want to use end-to-end encryption, just use it, and maybe offer advice to others on how to use it.
There are some politicians who have decided that only bad people use encryption. Going up to one of these politicians and trying to explain that you use encryption but you're actually a good person won't convince them that encryption's okay, it'll just convince them that you're a bad person. Politics is one of those things that attracts people who just want to find the shortest route to a decision about who are the good people and who are the bad people, and keeping secrets isn't something that those sorts of people like other people doing.
Unless you have evidence that the government is rounding up people just for using encryption, all this sort of advocacy does is to draw attention to you having something to hide, and therefore probably being some sort of wrong'un. If the government is rounding up people for using encryption, that's a specific threat you need to respond to, and starting a public campaign is not the right response.
Something is a crime if society determines that it should be so. Nothing more.
Clearly the pressure on government to write these laws is coming from somewhere. You should engage with the arguments the other side makes.
>You should engage with the arguments the other side makes.
The arguments are "Protect the children.", "Catch terrorists.", "Catch criminals.".
Those arguments have been engaged with for decades. They are purely emotional arguments. Anyone who still pushes those arguments forth is most likely doing so with ulterior motives and cannot be reasonably "engaged" with.
Let's not ignore the full history here. That is a bad faith argument. It was a crime to use expensive encryption 30 years ago, but a lot of decisions were made to allow it. Today, every single one of those old caveats about child porn, drugs, money laundering, terrorism, (both domestic and international) and criminal acts in general all have stories where weaker encryption would have saved hundreds and hundreds of lives. We have to recognize this or we're just arguing past each other.
https://fedsoc.org/commentary/publications/encryption-techno...
>where weaker encryption would have saved hundreds and hundreds of lives.
Can you do the same thing, but in the other direction? How many people would have been harmed if weaker/no encryption was the standard?
Let's go with the USPS. They have been sending daily communications where unencrypted info is the standard since 1775.
I'm not sure how this answers my question at all.
How many whistleblowers would have been killed without a secure way to blow the whistle? How many journalists and journalist sources would have been killed? Etc. These people aren't using the USPS for good reason.
Point being, you are only doing one side of your calculation and presenting it as a full argument. But it's just a bad argument unless you calculate both sides.
That's a different question. You asked how many people would have been harmed if weaker/no encryption was the standard. The USPS is a message system where federal employees are able to intercept suspicious content, and there is no built-in encryption for mail. Voting by mail is a great example of how a critical message can be sent without relying on encryption. Whistleblowers can still encrypt documents on a flash drive, and drop it into a mailbox. There is nothing stopping them from doing so.
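The "encrypt the documents, then mail the flash drive" step can be sketched with nothing but the standard library. This toy one-time pad (random key as long as the message, XORed in) only illustrates the principle that the ciphertext is useless without the key, which travels separately; in practice you would use a vetted tool like GPG or age rather than hand-rolled crypto:

```python
import secrets


def otp_encrypt(data: bytes) -> tuple[bytes, bytes]:
    # One-time pad: XOR with a truly random key as long as the message.
    key = secrets.token_bytes(len(data))
    ciphertext = bytes(a ^ b for a, b in zip(data, key))
    return ciphertext, key


def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so applying the same key again recovers the data.
    return bytes(a ^ b for a, b in zip(ciphertext, key))


doc = b"documents for the mailbox"
ct, key = otp_encrypt(doc)
assert ct != doc                     # what goes on the flash drive is unreadable
assert otp_decrypt(ct, key) == doc   # only the key holder can recover it
```

The point of the sketch: the mail system never needs to provide the secrecy, because the secrecy lives entirely in the key exchange.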
I don't really want to hash this same thing out for the... At least hundredth time. You're not going to convince me, I'm not going to convince you, and we'll both just leave less happy if we keep going.
>Whistleblowers can still encrypt documents on a flash drive, and drop it into a mailbox. There is nothing stopping them from doing so.
The only thing I want to highlight for your consideration is that the USA is not the entire world. The USPS, even if it were perfect, does not exist in the overwhelming majority of the world. People talk to people across borders.
(Also, with some of the proposed laws, encrypting the USB would be illegal)
And no service offering encryption has existed since 1775? Because that is required for your argument. Otherwise you simply send unimportant stuff via USPS and sensitive/secret/important stuff via non-USPS.
My vote, my taxes, my REAL-ID driver license, passport, credit cards, phone SIM, checks, 401k statements, etc. have very recently been sent via USPS. Do you consider this unimportant stuff?
You apparently do.
A bit of a nit-pick. 30 years ago was 1995. It was not a crime to use PGP in the US in 1995. What Zimmermann was charged with was exporting the encryption technology (or allowing it to export itself by putting the code on an anonymous FTP server). The Bernstein case was similar, in that it was the export of the machine-readable code the government objected to, not its domestic distribution. The right for researchers to publish text describing algorithms had earlier been recognized by the court (which is why Martin Gardner could describe the RSA cryptosystem in Scientific American in 1977).
>> You should engage with the arguments the other side makes.
> The arguments are "Protect the children.", "Catch terrorists.", "Catch criminals.".
> Those arguments have been engaged with for decades. They are purely emotional arguments. Anyone who still pushes those arguments forth is most likely doing so with ulterior motives and cannot be reasonably "engaged" with.
Oh come on. Why do you think "purely emotional arguments" are illegitimate? Are you some galaxy brain, coldly observing humanity from some ivory tower constructed of pure software?
Nearly all positions people take are, at their core, "emotional." And the disagreements that result in "arguments" are often really about differing values and priorities. You might value your "freedom" more than anything and are willing to tolerate a lot of bad stuff to preserve strong encryption, some other guy might be so bothered by child sexual abuse that he wants to give it no encrypted corner to hide in. You're both being emotional.
Those are both reasoned arguments. The emotional argument would be "some guy is so bothered by sexual abuse he wants to ban lightbulbs because once he heard about a lightbulb in the context of an abuse". The "solution" is not really a solution, but the emotional person does not really care about solutions, he's too emotional to think straight.
At least that is how I see the word used.
> Those are both reasoned arguments. The emotional argument...
Rationality and emotionality are not mutually exclusive, and I would say there are very, very few arguments that are devoid of emotion.
The GP was using "emotional" to dismiss the kind of arguments you're saying are reasoned.
>The the GP was using "emotional" to dismiss the kind of arguments
I'm dismissing arguments that are designed to appeal to (and manipulate) the emotions of the person listening. Such as the three examples I gave, which are, in almost every case, used to win an argument without having to consider any possible nuance of the situation.
Often, it's a completely thought-stopping appeal, because everything is simply countered with "so you don't care about children". Or, in your case, subtly alluding to me being tolerant of CSAM (which was wildly inappropriate, albeit a great example of why I generally just don't talk to people who use those types of arguments).
Apparently that makes me galaxy-brained or whatever, though. ¯\_(ツ)_/¯.
> I'm dismissing arguments that are designed to appeal to (and manipulate) the emotions of the person listening.
My point is that's pretty much all arguments, except maybe some very obtuse ones no one really cares about.
> Or, in your case, subtlety alluding to me being tolerant of CSAM (which was wildly inappropriate, albeit a great example of why I generally just don't talk to people who use those types of arguments).
That's not what I was doing. I was giving an example to show it's a trade-off driven by priorities and values. But if you want to be super-logical about it, supporting strong privacy-preserving uses of encryption necessarily implies a certain level of tolerance for CSAM, unless you support other draconian measures that are incompatible with privacy. Privacy can be used for good and bad.
>My point is that's pretty much all arguments, except maybe some very obtuse ones no one really cares about.
There is a distinct difference between a person having emotions while arguing, and using an appeal to emotion as a rhetorical tactic. I do not agree that "pretty much all arguments" contain an appeal to emotion (again, as a purposeful fallacious rhetorical tactic), even though all arguments obviously will have people feeling some sort of emotion.
Even looking through this entire thread, most of the disagreements here do not contain appeals to emotions.
I'm sure that any book on logic and rhetoric from the last few centuries would explain it better than I can. The wiki page has some good explanations and examples as well.
>Are you some galaxy brain, coldly observing humanity from some ivory tower constructed of pure software?
I just think arguments based on appeals to emotion are very often fallacious. But sure, I guess that means I'm a... whatever you just said.
For what it's worth, the anti-encryption/anti-privacy laws have caught terrorists in the UK. My company provides data storage for their dragnet and handles various requests, and I've seen firsthand four different instances where the UK government watching everyone's internet activity led to terrorists being caught.
> anti-encryption/anti-privacy laws have caught terrorists
This is undoubtedly so; but much turns on the trust in government. In the U.S., the president, himself a documented profligate liar, just invited an equally untrustworthy unelected person into the halls of government to vacuum up whatever data he pleased. Maybe trust in the UK government is higher.
There was LITERALLY intelligence in the president's daily briefing entitled "Bin Laden Determined to Strike in US."
https://nsarchive2.gwu.edu/NSAEBB/NSAEBB116/index.htm
Collecting data is often not the problem. The problem is how to evaluate it and use it to direct the use of finite law enforcement or counterintelligence resources.
But to your point, let's not forget congressional Republicans rushing into a SCIF on Capitol Hill with their mobile devices out, in clear violation of policy (and common sense). I am relieved by the fact that Trump and Musk do not seem to understand what they can use sensitive information for (other than perhaps to sell or give away to foreign governments and businesses.)
I think my point is good intelligence comes from stitching together numerous data points and often traffic analysis is as good (or better) than content analysis. And maybe that the overwhelming majority of elected officials have no conception of how intelligence is collected and evaluated.
Low hanging fruit. The smart ones likely aren't being caught now.
Moreover, it's only a matter of time until the criminal fraternity all catch up and are on the same wavelength. That's when all but the dumbest know exactly what not to do or say on the net.
The Internet is still comparatively young, and like everyone else, those with evil intent are still learning exactly how it works. I'd bet money that it won't be long before a 'bestseller tome' of definitive what-not-to-dos circulates amongst this mob.
The question is at what level will law enforcement's catch have to fall before it has to turn to other methods.
This number by itself means nothing as the other variables are unknown.
How many terrorists were not caught by these systems? How many would have actually done these actions instead of just talking about it? How many could have been caught with just standard police work?
Without knowing these variables then there is no way to say if these systems are particularly good at catching terrorists.
I wouldn't go as far as saying it means nothing, but I agree that the story certainly isn't simple. I was just pointing out that "catch terrorists" isn't a purely emotional argument. Would the terrorists have been caught anyway? We'll never know, but there's no way you can say for certain that they would. Personally I don't think catching a few terrorists is worth giving up privacy, but other people disagree.
> Without knowing these variables then there is no way to say if these systems are particularly good at catching terrorists.
I don't think we can ever figure this out, since no one is willing to run an RCT when it comes to counterterrorism.
That same government can use that same dragnet in the suppression of accountability for the war crimes and atrocities it is engaged in.
> Clearly the pressure on government to write these laws is coming from somewhere
Software surveillance vendors.
> Chat control: EU Ombudsman criticises revolving door between Europol and chat control tech lobbyist Thorn
> Breyer welcomes the outcome: “When a former Europol employee sells their internal knowledge and contacts for the purpose of lobbying personally known EU Commission staff, this is exactly what must be prevented. Since the revelation of ‘Chatcontrol-Gate,’ we know that the EU’s chat control proposal is ultimately a product of lobbying by an international surveillance-industrial complex. To ensure this never happens again, the surveillance lobbying swamp must be drained.”
https://www.patrick-breyer.de/en/chat-control-eu-ombudsman-c...
The problem is LEOs (and associated industry) claiming that enforcement is impossible without the ability to obtain cleartext.
This is a lie: obtaining cleartext just makes enforcement vastly easier and more scalable. If crims have encrypted mobile phones, you can still point a microphone at them.
Scalability is the big issue.
Honestly, I had always assumed LEO wanted access to decrypted message content so they could sell it to advertisers. I mean sure, you could catch a criminal or two, but with all that non-criminal data, just imagine how much off-the-books revenue you could accrue by selling it to the AdWords guys.
The other side being, for instance, the surveillance lobby that pushes for chat control laws in the EU? The "arguments the other side makes" are pretty clear at this point, and nothing to do with the "think about the kids" really, not sure engaging with them is the point.
> Something is a crime if society determines that it should be so. Nothing more.
According to The New Oxford Companion to Law, the term crime does not, in modern criminal law, have any simple and universally accepted definition.
Society also determined it was OK to use a firehose on black people, so I think the best we can say is that the term "crime" has nothing to do with morality, and people who conflate the two need to be looked at with suspicion.
> You should engage with the arguments the other side makes.
I don't. I think most arguments about crime require one-side to act in bad-faith. After all: The author doesn't actually mean that Encryption isn't illegal in some jurisdictions, they mean that it shouldn't be. You know this. I know this. And yet you really think someone needs your tautological definition of crime? I don't believe you.
The arguments are mostly that they dislike what can be accomplished via math. “The laws of mathematics are very commendable, but the only law that applies in Australia is the law of Australia” isn't exactly an 'argument' so much as an insistence.
The article does address the flaws in some of their arguments (encryption inconveniences law enforcement, think of the children) by pointing out that the average person and children are kept safe from criminal elements by encryption.
You can make a gun fairly easily with what can be accomplished with a CNC machine. It is still illegal.
Where that is illegal they don't go making CNC machines illegal because of that.
Legislators are literally trying to restrict sales of machines that can be used to build firearms. I don't agree with this, but it's happening.
https://www.nysenate.gov/legislation/bills/2025/A2228
They're not making math illegal.
> It is still illegal.
Not in the vast majority of the United States.
The arguments from the other side are of the "think of the children" and "tough on crime" variety. They are purely emotional and if you try to dispute them they just respond with "so you don't care about children?". It's like trying to argue with a religious person on matters of faith, you're just not very likely to convince them.
*edited to add "on matters of faith"
"Think of the children" is used so often when talking about LGBT issues, often not thinking at all about the LGBT children
"Think of the children" really means "Think of my children". Nobody gives a shit about someone else's children.
Kind of impossible when they meet in secret courts and have privileged access to Congress.
If we had trustworthy governments, or trustworthy police agencies, then maybe mandated backdoors wouldn't be all that bad. But if anything, recent events have clearly demonstrated that governments are not trustworthy, and even if one is trustworthy today, it could become an evil regime tomorrow. Handing all your power over literally anything to such an organization does not seem wise.
It doesn't seem like trustworthy governments is the issue. You can't have backdoors period because they'll be leaked / discovered and used by bad actors.
https://www.youtube.com/watch?v=VPBH1eW28mo
That too. But even if the government was perfect and trustworthy and free of leaks, that can still all go out the window as soon as a less trustworthy government is elected.
I have yet to see a case against someone that hinged on some data that was encrypted. Almost every tale from some cell needing to be cracked has ended in a fart because they got the information anyway using old-fashioned police investigation.
We went from Patriot Act to literally disappearing people without due process in only 23 years. Imagine if they could also decrypt your phone and plant evidence in advance.
Here's one:
https://www.telegraph.co.uk/technology/2021/05/23/meet-man-b...
I am against it as a matter of principle.
Even if you trust someone with your life and you know this person is never going to betray you and will always have your best interests at heart, that doesn't mean they automatically get a free pass to view and inspect everything you do every minute of every day until you die.
Unfortunately, that is what these governments want.
The problem is the average person doesn't care very much or understand it.
If you ask anyone if privacy matters they will of course say yes. If you ask them why they use software with telemetry or websites with Google Analytics they will simply shrug.
If you ask them if it's alright for the NSA to collect and analyze data from everyone they will say yes and they have nothing to hide.
People don't know what privacy is. They don't know what they are fighting for or where the fight is taking place.
If you take that and then add encryption to the mix... and you have politicians and agency plants talking about "saving the children from online pedos" by banning these "encryption apps and technology"....
>People don't know what privacy is.
You nailed the problem. Privacy is the tension between freedom and overwatch. Perfect privacy would yield zero justice, while zero privacy yields big brother/1984 overwatch. A healthy balance must exist for society to thrive.
"Secrecy of correspondence" is a longstanding legal principle in many countries (e.g. in Germany since the unification in 1871, in the US there was a supreme court ruling in 1877)
The only way to guarantee secrecy is through encryption, preferably e2e.
It’s honestly annoying how often experts speak up about this, and still nothing changes. We’re stuck in the same cycle—fear gets in the way, and in the end, it’s our privacy and security that suffer. If anything, this should be a sign to invest in stronger encryption and better law enforcement tactics that don’t mess with the tools keeping us safe online.
Imagine how much more successful and productive humanity would be if we weren't constantly being told to fear our neighbors.
Also... we're throwing around words like "crime" and "terror" and talking about shadowy quasi-governmental organizations encroaching on civil rights to privacy. I offer this commentary from the Eurythmics' score to Michael Radford's 1984 film "1984" to serve as background music for our discussions.
https://youtu.be/IcTP7YWPayU
There's an abstract argument template that I've noticed floating around. It goes like this:
I've heard it many times before. Reading this post feels like watching a rerun of Friends.

Are you saying that this template is what the article is presenting?
If so I don't believe it applies, in particular because you have stated that only a partial compromise on C is needed to prevent Y and Z.
There is no "partial compromise" on encryption, so this argument is flawed. There is no way to have encryption that "only the good guys" can break. It is either secure, or it is not.
C isn't encryption :(
My favorite version of it is "Let's ban air because terrorists breathe".
I've usually seen it phrased as "let's ban wheels|cars so bank robbers can't escape!".
But well, even that rebuttal is getting tiresome. It's the same people that keep pushing for banning air again and again. They control all the communication channels, so nobody can ever rebut them in a forum that matters, they control the governments, and they are still not popular enough to make that thing pass. Yet, they keep pushing for it.
I don't think we'll solve this by talking about this. We need to talk about systemic corruption instead. (But then, they control the communication channels...)
In my experience, these kinds of problems go unsolved simply because their difficulty is crystal clear and there are usually no champions willing to push them to the very end.
I love strawmen, but I've seen way too many in my time.
Guantanamo Bay was a thing though. Remember you are not banning all air.
And of course the definition of terrorist will vary based on what politicians want. The US recently sent some "terrorists" to a gulag, for example.
> Remember you are not banning all air.
Nope, it's about exactly that. This policy would work only for law-abiding citizens which terrorists are not, that's the point.
Added: Neither the current gulag expeditions in the US nor Guantanamo have anything to do with US citizens, which is a big difference from the GGP's comment.
That's a template, yes. But why is it bad?
Because only the first part adds something to the discussion. It starts with the problem, then considers only one of the possible solutions (usually the low-hanging one), states why it's bad, and ends by refuting the existence of a problem.
Seems to be geared towards Apple, but informative nevertheless.
To me, the only sure end-end encryption is gnupg, where you personally create the keys and distribute.
Not a crime, but somehow our dear EU overlords try every year or so to make it a crime in any way possible (eg. chat control).
If we want to play in a world with full transparency, let's start with the politicians!
And also apply it equally to ecommerce and homebanking.
Let's see how happy the voters are when they have to start walking to their bank again every week, can't order their latest Temu toxic-waste product anymore, and their GDP drops in half.
And like always they claim it's to protect our children... Who could possibly argue against protecting children?
Why do you need encryption? Are you sending pedo photos? Are you a pedo? Look guys, here's a pedo who wants encryption not to get caught!
/s
Also 's/pedo/terrorist/', or {russian|chinese|iranian|north korean} spy or any "bad guy of the day".
https://en.wikipedia.org/wiki/Four_Horsemen_of_the_Infocalyp...
The same people who want to make encryption a crime (like Trump 45[0]) are using signal to discuss sensitive information without an audit trail. It's absolutely rules for thee.
0 - https://www.politico.com/story/2019/06/27/trump-officials-we...
Same with Chat Control. LEOs and EU MEPs would be exempted from being surveilled, because their lives and communications need to be private since they are very important, but yours? God, no!
And people wonder why democracy is out of style. With democrats such as these, you don't need tyrants.
I believe encryption is the most important 2nd Amendment issue of our time, but I never see it framed that way.
Because SF-dwelling tech bros demand free speech but can perform the necessary mental gymnastics to overlook the right to manufacture and possess technology that has existed for over a century.
See also: the ACLU.
This kind of reminds me of the similar assertion that BitTorrent is not illegal.
Encryption is a threat to power structure. Of course if you're in power, and you're under threat, you criminalize threat.
As long as we preserve the knowledge of one-time pads, they will not take this power from us.
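For anyone who hasn't seen one, a one-time pad really is just XOR with a truly random key as long as the message. A minimal sketch in Python (illustrative only; the hard part in practice is generating, distributing, and never reusing the pad):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # Generate a truly random pad as long as the message, then XOR.
    # Given a random, secret, never-reused pad, the ciphertext reveals
    # nothing about the plaintext (information-theoretic secrecy).
    pad = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, pad))
    return ciphertext, pad

def otp_decrypt(ciphertext: bytes, pad: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, pad))

msg = b"meet at dawn"
ct, pad = otp_encrypt(msg)
assert otp_decrypt(ct, pad) == msg
```

No amount of legislation changes the fact that this fits in a dozen lines of code.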
> Ignoring experts doesn't make facts disappear
And yet it seems like every last politician, without literally a single exception, thinks that it does work that way.
As a software engineer who specialized in cryptography in the 1990s and didn't work for the NSA (working for RSADSI, Bell Canada and Certicom) I feel I have an informed vantage point from which to offer notes.
a) This seems like a decent introduction to the subject of cryptographic regulation in the last 30 years. It's far from exhaustive, however. I do appreciate the collected references from diverse points in the last several decades.
b) I would have mentioned "Sink Clipper" and the ACLU "dotRights" campaigns. Neither are especially easy to find in the increasingly enshittified google cache, but Le Monde Diplomatique has this article, complete with a link to Sink Clipper poster (I think from the mind of Kurt Stammberger) that no collection of CypherPunk oriented ephemera from the era can be without: https://mondediplo.com/openpage/selling-your-secrets
The ACLU dotRights.org site seems to have receded into history, but some of its content is still available at the archive. For example: https://web.archive.org/web/20100126102126/http://dotrights....
c) Herb Lin presented a very nice paper back in the day comparing PROPOSED encryption regulation with ACTUAL encryption regulation. I think the thesis was through the 90s, proposed regulation was increasingly draconian (clipper, etc.) but actual regulation was liberalizing (effective deregulation of open-source tools.) I found Herb's page at Stanford and heartily recommend it if for no other reason than it's sheer volume of written material: https://herblin.stanford.edu/recent-publications/recent-publ...
d) I was a little surprised the wired article linked to at the beginning of the piece didn't have that issue's front cover, which was sort of a cultural touchstone at the time. But you can see it here: https://pluralistic.net/2022/03/27/the-best-defense-against-... - and this one: https://www.reddit.com/r/Bitcoin/comments/1cgpktp/31_years_a... (dang, look at those non-receding hairlines!)
e) Making the web "secure" or "private" is like putting lipstick on a pig. Modern web technology is designed to de-anonymize and collect identifying information to enable targeted ad delivery. Thought I generally respect Moxie Marlinspike and have no great beef with Signal, there has been a concerted effort to exploit its device sharing protocol and your carrier and national governments can easily extract traffic analysis info from people using it. Were I to add one sentence to this guide, it would be "While these tools are better than nothing, they are far from perfect."
f) The guide seems to conflate encryption with privacy. Encryption technology can enable privacy, but you're not going to get privacy from encryption technology unless you pair it with well reasoned policy (for organizations) and operational guidelines (for both organizations and individuals.)
The extreme example is to say "nothing stops a participant in an encrypted communication from sharing the un-encrypted plaintext after it's recovered." People earnestly trying to maintain message security probably know not to do that, but when talking about exchanging keys and figuring out which keys or organizations you should trust, it's easy for even the well-informed to make privacy-eroding decisions.
So... I think this article is a good jumping off point, covering material I would call "required, but not sufficient." I would just view it as the beginning of a deep-dive instead of the end.
Just a bit more color on where this war on encryption is currently being fought:
https://community.qbix.com/t/the-global-war-on-end-to-end-en...
Playing devil's advocate here...
What is wrong with:
* an expiring certificate
* issued by the device manufacturer or application creator
* to law enforcement
* once a competent court of law has given approval
* that would allow a specific user's content to be decrypted prior to expiry
There are a million gradations of privacy from "completely open" to "e2e encrypted". Governments (good ones!) are rightly complaining that criminals are using encryption to commit particularly awful crimes. Politicians are (mistakenly) asking for a master key - but what I feel we should as a community support is some fine-grained legal process that would allow limited access to user information if justified by a warrant.
Competent jurisdictions allow this for physical search and seizure. It's not unreasonable to ask for the same thing to apply to digital data.
The first thing that's wrong is the principle - we should have a right to try to preserve our privacy. When even trying to hide is a crime, you live under tyranny.
The second thing that's wrong is the practice - despite the "going dark" panic spread by intelligence agencies, we have far, far less privacy than at any prior point in history, and spying on people, even people trying to hide, is much, much easier. So why the hell must we make it even easier still??
I don’t think this particular devil needs more advocacy.
Law enforcement agencies currently have more data about each of us and more sophisticated tools to investigate crimes than at any time in human history.
> Politicians are (mistakenly) asking for a master key - but what I feel we should as a community support is some fine-grained legal process that would allow limited access to user information if justified by a warrant.
The problem with all backdoors is the human element. Master keys will be leaked. A process to gain access to a temporary key is also subject to the human factor. We’ve already seen this happen with telecom processes that are only supposed to be available to law enforcement.
The other issue is one of a legitimately slippery slope. The asymmetric nature of the power dynamic between governments and their citizens makes it even more critical to avoid sliding down that slope.
And finally, in the environment you propose, criminals will just stop using services that are able to provide such services to the government. Criminality will continue while ordinary citizens lose more and more of their rights.
Well that's your view, but these demands aren't going to go away, and what I think is sensible is for us as a technical community to consider reasonable alternatives. Every society is a compromise between anarchic freedom and authoritarian tyranny, and this is another discussion about how a (relatively) new set of technologies can fit into that compromise in a way that is acceptable and reasonable.
I acknowledge the problems you raise, but it does seem to me that we have a good set of systems in place in the form of PKI that has a remarkable amount of flexibility.
It's frankly a bit of an article of faith in our community that encryption == unalloyed good and I think we'd be right to think more critically about that position.
> but these demands aren't going to go away
To me, this just means that we must remain vigilant. The slow creep towards authoritarianism isn’t going to go away either. The solution is not to look for reasonable ways for authoritarian rules to exist. Continuous harmful pressure must be met with continuous resistance.
> Every society is a compromise between anarchic freedom and authoritarian tyranny
Except not every society is such a compromise. Some are fully under authoritarian control, and serve as a warning for others who are tempted by authoritarian ideas.
> this is another discussion about how a (relatively) new set of technologies can fit into that compromise in a way that is acceptable and reasonable.
Breaking encryption need not inherently be part of that compromise. And until someone can explain how breaking encryption will actually stop the kind of bad actors used to justify such a direction (vs. driving them deeper underground, i.e. even if you outlaw encryption, it’s not as if law breakers will obey such a law), I see no merit in entertaining such a compromise. The crimes being committed are already illegal.
> It's frankly a bit of an article of faith in our community that encryption == unalloyed good
I don’t think most people in our community see it as inherently/perfectly good, but as extremely important and necessary. This is a critical distinction. As with everything, there are harms that come with the good, and such is the nature of all things. The question becomes: are the harms allowed worse than the good that is preserved? And would the new harms of disallowing the status quo be potentially worse than the harms supposedly prevented?
> I think we'd be right to think more critically about that position.
I agree that we need to think critically about this. But clearly we disagree about what one should conclude from such a critical analysis. I’d argue that taking the position that the government needs more power - especially at this moment in history - is the result of not thinking critically enough.
> Except not every society is such a compromise. Some are fully under authoritarian control, and serve as a warning for others who are tempted by authoritarian ideas.
Every society is on a continuum, and so represents some compromise between freedom for the citizen and power for the authorities. No society is perfectly free and no society is entirely authoritarian.
> Breaking encryption need not inherently be part of that compromise. And until someone can explain how breaking encryption will actually stop the kind of bad actors used to justify such a direction (vs. driving them deeper underground, i.e. even if you outlaw encryption, it’s not as if law breakers will obey such a law), I see no merit in entertaining such a compromise. The crimes being committed are already illegal.
Being able to legally access a private citizen's encrypted data in specific situations would help to (at least more rapidly) prosecute certain crimes more successfully. This is, I think, inarguably true. You can decide for yourself if that is worth a compromise. I'm somewhat on the fence.
> I don’t think most people in our community see it as inherently/perfectly good, but as extremely important and necessary. This is a critical distinction. As with everything, there are harms that come with the good, and such is the nature of all things. The question becomes: are the harms allowed worse than the good that is preserved? And would the new harms of disallowing the status quo be potentially worse than the harms supposedly prevented?
I think it's convenient and useful, but I hardly think it's necessary. Society managed to function just fine (although less conveniently) when strong encryption wasn't available for communications. Banking still happened, money still changed hands.
> I agree that we need to think critically about this. But clearly we disagree about what one should conclude from such a critical analysis. I’d argue that taking the position that the government needs more power - especially at this moment in history - is the result of not thinking critically enough.
It depends on how you define power. As society changes and new technologies emerge, maintaining existing government authority in new areas - and working out ways to ensure that authority is maintained - isn't really giving governments more power, but trying to ensure your society remains in the agreed location on the freedom/authority continuum.
If you see this as expanding powers, I can see how you would consider that a problem. But I think this is more about ensuring existing power is maintained correctly over a new area where crime is being committed.
Playing devil's prosecutor, I would say that technology has simultaneously made telecommunication a nearly constant part of life while also enabling mass surveillance on a global scale, and the process hasn't reached an endpoint. The result is an extremely slippery slope from "targeted lawful intercept" to "AI assisted sentiment analysis of every iMessage". Or in the future, everything seen by your AR glasses, every thought encoded by your neuralink chip...
Your limited lawful intercept example is reasonable to most, but as you yourself acknowledged, that's not what politicians are seeking. Therefore even if the community supports and enables "just that", politicians will eventually demand their wildcard cert. It will be a national emergency, after all.
Prior to expiry would suggest the encryption is broken from the start.
Although I do disagree on the reasonable/unreasonable angle, because I don't tend to analogize the contents of your phone to the contents of your safe, but rather to the contents of your mind.
Well I get that a significant part of our lives is wrapped up in our phones nowadays, but I still try to preserve a safe haven between my ears...mostly...
If the OEM can issue such a certificate, it probably isn’t necessary, because they can access the data and be subpoenaed directly, no?
Yes, what I am arguing for is requiring OEMs to implement this mechanism.
Frankly, if the NSA wanted to have Apple build a custom iOS version for a criminal so they could sniff his network traffic and flash content from the comfort of Maryland I don't believe that would be impossible today.
If an OEM could decrypt a users data, a government typically won’t bother to do it themselves. They’ll just use legal mechanisms to require the OEM to do the work for them.
Again, as a thought experiment, what legal protections can we put in place - an encryption ombudsman or independent authority - that would allow an arms length, controlled and expiring mechanism that allows limited access to a user's data? What would we as a society be happy to accept? I don't think the demand is an unreasonable one, but I'm trying to figure out what a reasonable collection of mechanisms looks like.
I think that’s more of a technical question. How would that entity be granted the ability to decrypt your data without the OEM being able?
This requires the device manufacturer to have the capability to decrypt the data (to be able to do so when all this process is properly observed)
If they have the capability to decrypt the data, a court can compel them to do so, disregarding the process you suggest. A cyberattack could achieve it without a court order.
This can't be solved technically.
It requires decryption by means beyond the sole supply of a user-owned key. That doesn't require a manufacturer to be capable of decrypting it.
I suspect that there are many ways that can be achieved, all technical ;-)
describe one.
Because it is not realistic to expect a government to always be "good". Courts are just going to rubber-stamp warrants, like they have done with present-day "lawful interception" warrants. And the keys are inevitably going to leak if they are used routinely to investigate common crime.
Apple's iOS firmware encryption key hasn't leaked, as far as I am aware. Do you know otherwise?
There are already very good solutions for ensuring that key leakages are very difficult to do and limited in effect.
is that very good solution "manage your own keys and don't give a special one to the government"?
That expiration is impossible to enforce. If you have the data and the cert, you can use it whenever you'd like, and the only thing preventing you from doing so is some piece of software voluntarily choosing to comply.
What that means is, there exists a master key in your scheme.
Certificates expire right now. It's part of how PKI works. Why can't the issued cert expire in the same way?
Users can ignore the expiration date on a TLS cert. Cryptography doesn't enforce time constraints, business logic does.
somewhere a piece of code would have to say "here I've got this key, which can decrypt this text, but I'm not going to" and that decision is not protected by math.
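To make that point concrete, here's a minimal Python sketch (the key, message, and function names are all illustrative, not any real system): the "expiry" lives in an ordinary if-statement, and any client that holds the key can simply omit it.

```python
import time

# Hypothetical escrow material; the key and plaintext are illustrative stand-ins.
ESCROW_KEY = b"master-key"
EXPIRY = time.time() - 3600  # pretend the warrant certificate expired an hour ago

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher for illustration only; a real system would use AES-GCM.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def compliant_decrypt(ciphertext: bytes) -> bytes:
    # The "expiring access" lives here, in plain business logic...
    if time.time() > EXPIRY:
        raise PermissionError("warrant certificate has expired")
    return xor_cipher(ciphertext, ESCROW_KEY)

def rogue_decrypt(ciphertext: bytes) -> bytes:
    # ...and anyone holding the key can simply skip the check.
    return xor_cipher(ciphertext, ESCROW_KEY)

secret = xor_cipher(b"attack at dawn", ESCROW_KEY)  # "encrypt" the message
# compliant_decrypt(secret) raises PermissionError, but:
print(rogue_decrypt(secret))  # prints b'attack at dawn' - the math never checked the clock
```

The cryptography only guarantees that the key decrypts the data; everything about *when* and *whether* it may be used is enforced by whoever wrote, and can rewrite, the software.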
I'm not sure I follow. Obviously the application itself needs to support the business logic described, in the same way as your web browser needs to notice that a certificate has expired and tell you there's a problem with a website you're visiting. What I'm exploring is why requiring certain applications to support the same sort of thing to decrypt user data in certain circumstances to support law enforcement is a problem.
the closest you can get to what you want is a trusted third party who would help derive the final key. so the key could not be revealed to law enforcement without cooperation of the trusted third party who would verify policies like time, etc. it may also be possible to have the 'trusted third party' be a piece of tamper proof hardware. i think generally people are suspicious of these schemes because it relies on 'trust'.
also, i think apple has a scheme similar to this for protecting the passcode from being brute forced when recovering from iCloud backup. however, if this scheme breaks it doesn't reveal the encryption key i believe it just allows the passcode that protects the encryption key to be brute forced which I guess may or may not result in the encryption key being revealed.
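For what it's worth, the simplest form of the split-trust idea above can be sketched with 2-of-2 XOR secret sharing (variable names are illustrative). It also shows why "trust" stays the load-bearing part: whoever runs the combine step holds the full key.

```python
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    # 2-of-2 XOR secret sharing: each share on its own is statistically
    # independent of the key, so neither holder can decrypt alone.
    share_a = secrets.token_bytes(len(key))
    share_b = bytes(k ^ a for k, a in zip(key, share_a))
    return share_a, share_b

def combine(share_a: bytes, share_b: bytes) -> bytes:
    # XOR the shares back together to recover the original key.
    return bytes(a ^ b for a, b in zip(share_a, share_b))

key = secrets.token_bytes(32)
device_share, escrow_share = split_key(key)  # e.g. user device + trusted third party
assert combine(device_share, escrow_share) == key
# The trust problem remains: the party that combines the shares has the key.
```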
The problem is that nefarious actors aren't physically barred from the data. If China, Big Balls, Zuckerberg, or anyone else want to access that data then they can just remove that check.
More importantly, the thing you're asking for (law enforcement retroactively snooping without there existing a master key) is always impossible.
For other forms of snooping (like a warrant to tap communications for a single device for a period of time), you have related issues. Suppose you magically make such a thing flawless -- the client can't detect intrusion, a single key is actually time-bound, etc. There still exists a group of people with the power to hand out such keys, and that power, however it's implemented, is still a master key to all future communications over that protocol.
You can partially mitigate the risk in various ways, but you can't eliminate it. Every proposal for weakening cryptography in that way has had glaring flaws, and many known attempts at actually weakening it have later been cracked by nefarious actors. Spying, but only for the "good guys," should be met with extreme skepticism as far as cryptographic protocols are concerned.
For all of these schemes, what happens when the people holding keys and power are physically forced out (DOGE et al)? Even if we assume the thing is implemented flawlessly, the people involved never leak anything, the master keys stay secret, ..., you still have the human problem of transitions in power. Do you want the current US administration, one currently arguing that it can "deport" actual citizens to torture prisons with no recourse or court case, to know that six years ago your daughter confided to her best friend that she got an abortion once? That she doesn't believe Israel should be committing genocide? Or, suppose you approve of the current administration, what about the next one that takes the reins with this new set of powers? It's bad enough without decades of chat history to let 70%-accurate AI comb through and make deportation decisions.
Am I allowed to keep a secret?
Maybe I am not allowed to write it down and also keep it secret.
Yes, but if a court decides you have committed a crime, and law enforcement show sufficient cause to obtain a warrant, they can seize your secret and - if it's relevant - use it to show your state of mind when the crime was committed.
Did you not read 1984?
Yes, several times.
But in the same way that as a society we allow physical privacy (and freedom!) to be removed under certain circumstances, we should consider allowing digital privacy to be removed in the same way. 1984 imagines a world where the authorities can enter your physical space at any time because they feel like it. But I don't lie awake worrying about that because I live in a society where I feel the social contract is largely upheld by the authorities.
> issued by the device manufacturer or application creator
The problem is that if the application has the power to do this then the rest is irrelevant
That means hackers/governments/the CIA can force the application creator to do their bidding and enable mass surveillance.
I don't accept that. We have "master keys" for some forms of encryption right now in the form of root certificates; knowing that root cert authorities could issue certificates that might allow people to sniff my network traffic doesn't keep me awake at night.
To reduce any risks, almost everything PKI-related is conducted in public, is auditable by anyone, and is a cooperation between dozens of distinct entities located globally.
This is not analogous to a single government having non-transparent, non-auditable access to decrypt communications of its own citizens.
Then set up the system to be more analogous. Make key issuance under this mechanism public after a period of time.
Again, I see us falling back into an "all or nothing" view of privacy and I just don't think those are the only options.
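As a sketch of what "public after a period of time" could look like, here is a toy hash-chained issuance log in the spirit of Certificate Transparency (the class and field names are my own invention, not any real system): each entry commits to the previous one, so retroactively altering or deleting an issuance record is detectable by anyone replaying the chain.

```python
import hashlib
import json

class IssuanceLog:
    """Toy append-only log of key issuances. Each entry's hash covers the
    previous entry's hash, so tampering anywhere breaks verification."""

    def __init__(self):
        self.entries = []

    def append(self, record: dict) -> str:
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = json.dumps(record, sort_keys=True)
        h = hashlib.sha256((prev + body).encode()).hexdigest()
        self.entries.append({"record": record, "prev": prev, "hash": h})
        return h

    def verify(self) -> bool:
        prev = "0" * 64
        for e in self.entries:
            body = json.dumps(e["record"], sort_keys=True)
            if e["prev"] != prev:
                return False
            if e["hash"] != hashlib.sha256((prev + body).encode()).hexdigest():
                return False
            prev = e["hash"]
        return True

log = IssuanceLog()
log.append({"warrant": "2024-001", "target": "device-serial-x"})  # illustrative records
log.append({"warrant": "2024-002", "target": "device-serial-y"})
assert log.verify()
log.entries[0]["record"]["warrant"] = "forged"  # retroactive tampering...
assert not log.verify()                         # ...is detectable
```

Of course, a log only makes misuse visible after the fact; it doesn't prevent it, which is one of the gaps the replies below point at.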
>Then set up the system to be more analogous. Make key issuance under this mechanism public after a period of time.
That (somewhat, barely) addresses one of ~dozen issues with the proposal.
>Again, I see us falling back into an "all or nothing" view of privacy
Not to be too pedantic, but I think the distinction between privacy and encryption is incredibly important: almost everyone agrees that privacy is a gradient. The disagreement is whether or not encryption can be a gradient. Most people do not think it can reasonably be without undermining ~everything relying on it.
I get the hostility towards it - as I've said elsewhere, it's practically an article of faith in our community that strong encryption == unalloyed good. And clearly it needs a lot of thinking to address potential abuse. But we've done it for other things.
> Not to be too pedantic, but I think the distinction between privacy and encryption is incredibly important: almost everyone agrees that privacy is a gradient. The disagreement is whether or not encryption can be a gradient. Most people do not think it can reasonably be without undermining ~everything relying on it.
That is a fair criticism. I would answer that by saying that encryption is just a technology, and you can employ it in very flexible ways (including e.g. n-of-m style keys) which if thought through well and legislated carefully could give the authorities more reasonable access to data when it is legally warranted.
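An n-of-m arrangement like the one mentioned can be illustrated with a toy Shamir secret sharing implementation (the prime, parameters, and party assignments are purely illustrative; a real deployment would use a vetted library):

```python
import secrets

P = 2**127 - 1  # a Mersenne prime, large enough for a toy 16-byte secret

def make_shares(secret: int, n: int, k: int) -> list[tuple[int, int]]:
    # Shamir k-of-n: build a random degree-(k-1) polynomial with the secret
    # as the constant term; each share is a point on that polynomial.
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares: list[tuple[int, int]]) -> int:
    # Lagrange interpolation at x = 0 recovers the constant term.
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

secret = secrets.randbelow(P)
shares = make_shares(secret, n=3, k=2)  # e.g. OEM, court, independent oversight body
assert recover(shares[:2]) == secret    # any two of the three parties suffice
assert recover(shares[1:]) == secret    # one share alone reveals nothing
```

The math here is solid; the objection upthread is that the parties holding shares, and the process that decides when they combine them, are still institutions rather than math.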
Not really. You can get around it by pinning public keys, as IoT devices, Tor, and i2p do.
A proposal to backdoor all cryptography is worse than having PKI as a thing we opt in to for the sake of convenience.
That sounds like a golden key approach, and the problem is your communication is no longer protected by math, it's only protected by the will of a stranger to be tortured by the government to protect you
https://www.rsaconference.com/library/blog/a-golden-key-to-u...
The back and forth discussion on cryptography is happening because there just isn't much middle ground. Either someone else can read your messages, or nobody else can. If one person can read them, the government will push on them until they crack.
No, I don't accept that this is the case any more than the root certificate system is a golden key. I'm quite sure that Apple can issue me a certificate that allows me to build a custom version of iOS that can be flashed onto my phone; why doesn't the same thing apply to other things?
The two things you are talking about are very different. One is signatures and one is encryption.
Well, if decryption is so justified, then brute-force breaking that takes significant resources, so it's hard to misuse unnoticed, would be a good approach. When you can only break into 100 phones a year, there's no slippery slope, and no fascist government could wildly misuse it for its own gain, because it's not physically viable.
Ok! That is a sensible rejoinder. Make a proof-of-work system that limits the authority from making more than n requests in a space of time. A good brake on abuse.
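A hashcash-style proof-of-work gate along those lines might look like the toy sketch below (the difficulty constant and request id are illustrative). One honest caveat: wall-clock compute is a weak brake against an actor with data centers, so the difficulty would need to be set against state-scale hardware, not a laptop.

```python
import hashlib
import itertools

DIFFICULTY = 16  # required leading zero bits; tune so each request costs real compute

def solve(request_id: str) -> int:
    # Brute-force a nonce whose hash falls below the difficulty target.
    target = 1 << (256 - DIFFICULTY)
    for nonce in itertools.count():
        h = hashlib.sha256(f"{request_id}:{nonce}".encode()).digest()
        if int.from_bytes(h, "big") < target:
            return nonce

def verify(request_id: str, nonce: int) -> bool:
    # Verification is cheap: one hash, regardless of how hard solving was.
    h = hashlib.sha256(f"{request_id}:{nonce}".encode()).digest()
    return int.from_bytes(h, "big") < (1 << (256 - DIFFICULTY))

nonce = solve("warrant-2024-001")       # illustrative request identifier
assert verify("warrant-2024-001", nonce)
```

Binding each solution to a specific request id is what makes the work non-reusable across requests, which is the rate-limiting property the parent is after.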
> Governments (good ones!) are rightly complaining that criminals are using encryption to commit particularly awful crimes.
For starters, I don't know a lot of good governments. So you'll have to define how you differentiate between a good one and a bad one.
> Governments (good ones!) are rightly complaining that criminals are using encryption to commit particularly awful crimes.
Secondly, criminals use public transport and roads built with taxpayer money to commit crime. Some even say that they breathe the same air as us honest citizens.
They also live in homes with 4 walls that you can't see through either.
I am being facetious but you can see where I am going with this.
If you think that the governments will stop at spying on criminals once this backdoor is in place, then I have a bridge to sell you.
Do you want your kids to grow in world were everything they do online will be analyzed, categorized and reviewed by some random government employee somewhere?
What if this government turns bad in the future as it has happened countless times in the past? What do you do then?
> I feel we should as a community support is some fine-grained legal process that would allow limited access to user information if justified by a warrant.
The problem with this line of thinking is that it doesn't hold up in the real world. Once you grant access to something like, say, your browser history to the government or any entity, what's to stop them from asking for more next time?
It's not a big deal, right? They can say, well, you gave us access to A, now we want access to B. Then in 3 years they will come back demanding access to C, D and E, until your entire privacy has been taken away from you.
And every time, they will use the same excuses, fighting crime, fighting drugs, child grooming and terrorism.
> Competent jurisdictions allow this for physical search and seizure.
That is not even remotely comparable.
In those cases, you need a judge or someone to approve the seizure. With a backdoor that can be opened at any time, you should consider that nothing will be private because there is no one who is going to be monitoring it 24/7 to make sure that there are no abuses.
> In those cases, you need a judge or someone to approve the seizure. With a backdoor that can be opened at any time, you should consider that nothing will be private because there is no one who is going to be monitoring it 24/7 to make sure that there are no abuses.
I'm not sure you've read what I wrote correctly. My hypothetical system would not allow the backdoor to be opened at any time, but it would require a certificate to be issued (derived from the manufacturer / application creator's root) that gives limited, expiring access on the production of court-authorised warrant, in exactly the same way a judge gives the police permission to enter your physical property.
Which governments are the good ones?
Is Indian government a good one, or Hungary's, or Turkish, German, or British, or the US? In the last case (well, in all cases), does "goodness" of a government depend on the current incumbent? What if a previously "good" government turns into an atrocious one?
See also: the detailed Dutch census, which was mostly harmless, until it fell into hands of the Nazis in 1940 and helped them to identify and exterminate almost all Jews in the country.
Every system of authority carries with it the risk of abuse; but we still accept legitimate authorities carrying out breaches of personal privacy for the sake of law enforcement - the warrant system being the obvious one. That's part of the compromise we make in society.
Good governments ensure that a breach of personal privacy has to travel through a legitimate process with an independent judiciary to limit the risk.
Do you think that this can be done without introducing massive security weaknesses into systems that cannot have them?
Also, there is a question if you believe the authorities that without decrypting data, they can't investigate crimes.
Imagine an analogous assertion that without torturing suspects, law enforcement is stymied. Someone might assert that, but we still say no, for all sorts of fundamental reasons. Same with American Miranda rights and others.
Myself, I don't believe in that assertion at all. Most crimes leave a massive real world trace that cannot be encrypted. The ones that don't, maybe should not be crimes in the first place.
> Do you think that this can be done without introducing massive security weaknesses into systems that cannot have them?
Yes, I do - or rather, that is the point of the discussion. We currently allow central authorities to indicate our permission to do or be something in the root certificate system. Why can't something similar be designed to allow controlled decryption?
> Also, there is a question if you believe the authorities that without decrypting data, they can't investigate crimes.
Clearly there are circumstances in which being able to decrypt the data of a criminal would assist in prosecuting crime. See EncroChat for an example of how this has worked.
> Imagine an analogous assertion that without torturing suspects, law enforcement is stymied. Someone might assert that, but we still say no, for all sorts of fundamental reasons. Same with American Miranda rights and others.
Yes. Clearly there are reasonable limits that need to be applied before we can allow controlled decryption of data. I am not arguing for issuance of a master key. See my original post.
> Myself, I don't believe in that assertion at all. Most crimes leave a massive real world trace that cannot be encrypted. The ones that don't, maybe should not be crimes in the first place.
Some do, and some don't. Things like e.g. cryptocurrency heists have profound effects, and are propping up North Korea. Those are definitely crimes...
> It's not a crime to lock your home's door for protection, why would it be a crime to lock your digital door?
A locked home's door is still trivially opened. You can pick the lock or even apply simple brute force, neither of which is all that difficult, and open it will. Similarly, I don't suppose anyone would be concerned about you using rot13 encryption. If a home could be sealed to the same degree as strong encryption, it absolutely would be a crime, for better or worse.
Under what law? High security vaults are not legally controlled or prohibited in the US.
Which high security vault can the government not gain access to under any circumstances? I expect you'll find decent explosives or a bulldozer will get them in just fine.
So will a hardware backdoor planted by your maid, or a telescopic lens pointed at your screen, or laser microphone on your window, get them into your e2e encrypted chats.
Huh? The encrypted data is at rest and the only person who knew the key is dead. Your plan makes no sense.
Generally the set of people who are relevant in the debate of the balance of privacy rights and criminal prosecution, are living.
Dead people are distinctly immune to prosecution, and generally granted fewer rights.
If you intended to reply to a different thread and accidentally ended up here instead, there is truth to what you say, but it has nothing to do with this one.
As it pertains to this thread, where the sole key holder is dead and took the knowledge with him, how do you anticipate to carry out gaining access to the data using live attacks? There are plenty of reasons why the government wants access to data even where prosecution isn't necessary.
The overlap between data that was never shared and data that is relevant after the person was dead is excruciatingly small.
It is excruciatingly small in all cases, which is why laws haven't been crafted yet. But if that changes...
Is encrypted data at rest belonging to dead people such a problem, that it's worth sacrificing everyone's privacy?
> that it's worth sacrificing everyone's privacy?
Is the appeal to emotion really necessary? Surely we can discuss the facts without devolving into some kind of "But I want that!!!" toddler behaviour?
There was no emotion, only an accurate description of the consequences of the proposed policy.
I said appeal to emotion. There is no logical foundation for the question. It can only be asked in the context of how one arbitrarily feels about the subject, serving no purpose, and derailing was taking place for no good reason. How one feels about the subject has no impact on what we are talking about, and trying to draw a link between them is completely nonsensical.
The issue isn't "gain access to" - it's "gain access to without destroying the contents."
Explosives and bulldozers are likely to harm whatever was motivating the entry in the first place. The vault system can be engineered to ensure this conclusion, as well.
The question specifically asked which one(s) you are talking about, not about your dreams. Which high security vault is the government not able to gain access to?
And, sure, if enough perfectly engineered vaults were impeding the government from carrying out the activities it wants to carry out, there would be calls to make building/using such a vault illegal too. In the real world, such vaults, if they exist at all, don't meaningfully get in the way. Thus there is no reason to think about it. We don't create laws on what theoretically might be a problem in some magical imagined world. We only create laws after something is identified as an actual problem.
> The question specifically asked which one you are talking about, not about your dreams. Which one is like that?
After your five ninja edits, it's been hard to keep up:
Glass relocker mechanisms have existed (in reality) on safe doors for decades and will often result in the destruction of contents if triggered and opening is still required.
Governments are normally seeking evidence: a stack of cash or a quantity of bulk substances are substantially harder to rig to destroy (obviating evidence gathering) than documents or data.
> After your five ninja edits, it's been hard to keep up
No need to reply within the first second. Take your time.
In fact, consider taking a lot more time as you still haven't named the specific vault, or set of vaults, that is causing such a big problem for the government. If we don't know what vault it is, even if your description is vague, how would anyone come to think of it as a problem? Laws are not created by some all-knowing deity. It is just people.
That such a vault might be theoretically possible to build is irrelevant.
> as you still haven't named the specific vault
Vaults and safes are boutique products. Glass relockers have been sold for decades - can you not extrapolate that heat and impact might destroy something inside of a highly thermally-conductive container?
HSMs and similar tech have had tamper detection systems for decades with internal battery backups.. these aren't illegal yet. My server cases from 20 years ago had tamper switches for exactly this purpose. How hard is this stuff to engineer?
> Glass relockers have been sold for decades
And...?
Let me ask again: Which vault(s) are currently, or at least in recent enough memory for anyone to recall, causing great strife for the government? Even a rough location would be sufficient. We can offer that in the case of encryption. There are countless news articles about police not being able to decrypt data they deem important.
Without that, it doesn't matter. Laws are not created based on imagined situations that you can dream up. They only are created after something has become a problem. You can use a perfectly impenetrable vault all day long and as long as the government doesn't want in, it is never going to care.
Of course, the greater subject is really about houses, not vaults. The government has good reason to want to get into your house. For example, you might perish in it, and it needs to get in to deal with your mess. This is a relatively frequent task placed upon government to carry out. If you've made your house impenetrable, government isn't going to remain amused for long. If the government starts encountering that problem often, absolutely it would become illegal.
It is not illegal today because it has never posed a real problem.
With physical access, global superpowers can break both vaults and strong encryption.
That analogy breaks because a home's locked door has the constraint that it can effectively only be visited by someone coming to that door physically. On the internet, multiple criminals can attack all doors at all times.
https://www.youtube.com/watch?v=VPBH1eW28mo
This!
Scalability is the crux of why encryption must not be infringed.
The claim that LEOs need to break encryption is based on laziness: they want to easily obtain access to communication, and at scale. They've always been able to obtain communication the hard way, and one-at-a-time - encryption doesn't change that.
So in general, shit security is legal, good security is a crime?
A warehouse with shutters and bulky padlocks, a night security guard and camera system is a crime? A bank vault is a crime? Safety deposit boxes?
> A warehouse with shutters and bulky padlocks, a night security guard and camera system is a crime?
No, why would it be? The security guard isn't going to wage war with the police/military when they want in. The guard will politely comply with any legitimate (and probably even illegitimate) request for access.
> A bank vault is a crime? Safety deposit boxes?
Banks are heavily regulated by the government. They especially aren't going to impede access if push comes to shove.
Laws aren't created on purely theoretical grounds. They are created only when a problem that needs to be solved is identified. The government has never had much trouble accessing physical spaces when they feel a need to. They have had trouble accessing encrypted data.
Ok, you make good points. Now for the doozy: is thinking (without transcribing it for the government) illegal?
> is thinking (without transcribing for the government) illegal.
Thinking without a willingness to share what you thought with the government when it feels it needs to know (e.g. in court) is illegal. Full transcription is not always legally required, but it is in some specific contexts where there have been problems getting proper disclosure. Again, laws are created to deal with actual problems, not imagined problems.
I'll note that encryption isn't illegal today. While there are some outlier cases where it has been a challenge to government, it hasn't become a big enough problem to do anything about yet. But if it reaches the point where it is deemed sufficiently problematic, it will become illegal in some kind of fashion. What that looks like is obviously to be seen. It won't necessarily be a blanket ban on all encryption, or even a ban on encryption at all, but most people are not capable of imagining anything else, so here we are.
I'm not surprised; the UK has become a literal African/Middle-East hell hole. They've kicked out all working immigrants and replaced them with ultra-religious freaks.
And of course, the cherry on top is that the UK is a country where every form of self-defense is the most serious crime: when attacked you must call the police, then lie on the ground and die.
>>I'm not surprised, for years UK become literal African/Middle-East hell hole.
I wonder where in the UK you live, because up here in the North that definitely doesn't seem right - it's rare to see anyone non-white on the street.
It's pretty universally known by now that where you live can be considered a microcosm relative to the rest of the country. Although I wouldn't necessarily have used "hell hole" to describe the rest.