Telegram was built to protect activists and ordinary people from corrupt governments and corporations – we do not allow criminals to abuse our platform to evade justice.
So who gets to decide what’s a lawful request and what’s criminal activity? It’s criminal in some states to seek an abortion or to help with one, so would they hand out the IPs of those “criminals”? Because, depending on who you ask, some will tell you they’re basically murderers. And that’s just one example.
Good privacy apps, like Signal, have nothing to hand out to any government.
Exactly. The strive for zero knowledge is the proper way to be going.
But then you can’t sell your customer’s data for profit. Even if you don’t now, you still have that option in the future.
Exactly. Which is the entire reason you should do it. Since you can’t sell your customers’ data for profit, you have to make your money from your customers instead. And another business could always start up and compete with you. Also, your customers will trust you more.
The second I went to sign up and learned a phone number was absolutely required, I knew that their privacy was pure bullshit. That little declaration at the end here is an absolute slap in the face.
It’s bad for privacy no matter how you sell it. Unless you have a good amount of disposable income to buy up burner numbers all the time, a phone number tends to be incredibly identifying. So if a government agency comes along saying “Hey, we know this account sent this message and you have to give us everything you have about this account,” for the average person it doesn’t end up being much different from having handed over your full ID.
Another aspect is the social graph. It’s aimed at normies so they can easily switch to it.
Very few people want to install a communication app, open the compose screen for the first time, and be met by an empty list of who they can communicate with.
https://signal.org/blog/private-contact-discovery/
By using phone numbers, you can message your friends without needing to have them all register usernames and tell them to you. It also means Signal doesn’t need to keep a copy of your contact list on their servers; everyone keeps their own local contact list.
This means private messages for loads of people, which is their goal.
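To make the contact-discovery point concrete, here is a deliberately naive sketch of how phone-number-based discovery can work without the server ever holding anyone’s address book. This is not Signal’s actual protocol (the linked blog post describes a far more involved design, precisely because plain hashes of phone numbers are easy to brute-force), and the function, variable names, and numbers below are made up purely for illustration.

```python
import hashlib

def hash_number(phone: str) -> str:
    """Hash a normalized phone number client-side before anything is sent."""
    digits = "".join(ch for ch in phone if ch.isdigit())
    return hashlib.sha256(digits.encode()).hexdigest()

# Client side: hash the local address book (made-up numbers);
# the raw numbers never leave the phone.
local_contacts = ["+1 555 123 4567", "+1 555 987 6543"]
query = {hash_number(n) for n in local_contacts}

# Server side: only hashes of registered numbers, no copy of anyone's contact list.
registered_users = {hash_number(n) for n in ["+1 555 123 4567", "+1 555 000 1111"]}

# The overlap tells the client which contacts are already on the service.
matches = query & registered_users
print(f"{len(matches)} of your contacts are registered")  # -> 1 of your contacts are registered
```

Which also shows the tradeoff being argued here: the phone number space is small enough that such hashes are reversible in practice, so the number itself stays the identifying piece no matter how carefully the lookup is done.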
Hey, we know this account sent this message and you have to give us everything you have about this account
It’s a bit backwards: since your account is your phone number, the agency would be asking “give us everything you have from this number”. They’ve already IDed you at that point.
Yep, at that point they’re just fishing for more which, hey, why wouldn’t they.
It’s a give and take, for sure. Requiring a real phone number makes it harder for automated spam bots to use the service, but at the same time it puts the weight of true privacy on the shoulders and wallets of the users and, in a lesser way, incentivizes the use of less-than-reputable services should a user want to truly keep their activities private.
And yeah, there’s an argument to be made for keeping crime at bay, but that comes with risks of its own. If there were some way to keep truly egregious use at bay while not risking a $10,000 fine on someone for downloading an episode of Ms. Marvel, I think that would be great.
It says right there in the subpoena: “You are required to provide all information tied to the following phone numbers.” This means the phone number requirement had already created a leak of private information in this instance; Signal simply couldn’t add more to it.
Additionally, that was posted in 2021. Since then, Signal has introduced usernames to “keep your phone number private.” Good for your average Joe Blow, but should another subpoena be submitted, this time stating “You are required to provide all information tied to the following usernames,” they will have something to give: the user’s phone number, which can then be used to tie any use of Signal they already have proof of back to the individual.
Yeah, it’s great that they don’t log what you send, but that doesn’t help if they get proof in any other way. The fact is, because of the phone number requirement, anything you ever send on Signal can easily be tied back to you should it get out, and that subpoena alone is proof that it does.
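To spell that out: for username-based messaging to work at all, the service has to keep some server-side mapping from username to the underlying account, and whatever that mapping contains is exactly what a subpoena can reach. The sketch below is purely hypothetical (the dictionary, usernames, and numbers are invented and say nothing about how Signal actually stores this); it only illustrates why “we have nothing to hand over” stops being true once such a mapping exists.

```python
# Purely hypothetical server-side state, invented for illustration;
# this is NOT a description of Signal's actual storage model.
username_to_phone = {
    "@average_joe": "+15551230000",
    "@jane_doe":    "+15559870000",
}

def respond_to_subpoena(usernames):
    """Hand over whatever the service actually holds for the named accounts."""
    return {u: username_to_phone[u] for u in usernames if u in username_to_phone}

# "You are required to provide all information tied to the following usernames."
print(respond_to_subpoena(["@average_joe", "@not_registered"]))
# -> {'@average_joe': '+15551230000'}
```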
Guys like you see privacy as a monolith, which it never is. Unusable privacy is meaningless, as email has shown. Privacy of communications does not mean privacy of communicators, and usable authentication can be more important than anonymity.
And all of this has to be realised on real-world servers, which are always within reach of a real-world government.
None of this is my opinion, it’s just how the world works LOL
In which country?
In your opinion, all companies must disclose the personal information of customers whenever a Government says “This person broke the law”?
Can you elaborate?
Not necessarily, but kinda. The gov typically needs some sort of warrant, and they need approval from the country they’re requesting it from.
Which Government?
Pardon my ignorance as this is my first time using the internet, but I am pretty sure that the Governments of the planet do not all use a universal set of laws or procedures for enforcement.
So in your world, journalists and activists trying to bring attention to the human rights violations their country’s fascist government is committing, in an attempt to bring about positive change, should just be fucked over, right?
Because those governments label those people as “criminals” when they’re objectively not.
This may be of some use to you.
https://www.merriam-webster.com/dictionary/elaborate
The gov typically needs some sort of warrant, and they need approval from the country they’re requesting it from.
United States of America? Canada? North Korea? China? Australia? Saudi Arabia? South Africa? Brazil?
The point is the app was designed for secure communication, specifically secure from corrupt governments, which is why it is problematic to hand over user data just because the individual is breaking a law in that country.
Or to use the example from the top:
So who gets to decide what’s a lawful request and what’s criminal activity? It’s criminal in some states to seek an abortion or to help with one, so would they hand out the IPs of those “criminals”? Because, depending on who you ask, some will tell you they’re basically murderers. And that’s just one example.
In the US, agents must petition a judge for a search warrant. If granted, the agent may then compel an IT company to produce records. If the company is able to, it must comply. It isn’t up to the CEO to decide what he feels is right.
Look for services that allow your data to be encrypted, but that also clearly state the service provider does not have the encryption keys – you do. Apple does this, I believe.
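As a rough illustration of what “the provider does not have the keys” means in practice, here is a minimal sketch assuming the third-party cryptography package (this is not how Apple or any particular service actually implements it). The key is generated and kept on the user’s device, so ciphertext is all the provider ever stores, and all it could ever hand over.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# Key is generated and kept client-side; the provider never sees it.
key = Fernet.generate_key()
f = Fernet(key)

# What gets uploaded: opaque ciphertext only.
ciphertext = f.encrypt(b"meet at the usual place at 7")
print(ciphertext)

# Only the key holder can read it back.
print(f.decrypt(ciphertext))  # -> b'meet at the usual place at 7'
```

Of course, metadata like which phone number owns the account lives outside that envelope, which is the whole point of the thread above.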
Probably Telegram themselves. Durov was forced into exile by Putin.