“Telegram is not a private messenger. There’s nothing private about it. It’s the opposite. It’s a cloud messenger where every message you’ve ever sent or received sits in plain text in a database that Telegram the organization controls and has access to.”
“It’s like a Russian oligarch starting an unencrypted version of WhatsApp, a pixel for pixel clone of WhatsApp. That should be kind of a difficult brand to operate. Somehow, they’ve done a really amazing job of convincing the whole world that this is an encrypted messaging app and that the founder is some kind of Russian dissident, even though he goes there once a month, the whole team lives in Russia, and their families are there.”
" What happened in France is they just chose not to respond to the subpoena. So that’s in violation of the law. And, he gets arrested in France, right? And everyone’s like, oh, France. But I think the key point is they have the data, like they can respond to the subpoenas where as Signal, for instance, doesn’t have access to the data and couldn’t respond to that same request. To me it’s very obvious that Russia would’ve had a much less polite version of that conversation with Pavel Durov and the telegram team before this moment"


Not at all. Not everyone needs to audit open source; only a few interested experts do. Most importantly, auditing is possible at all because it’s out in the open.
The “just trust me” model of Signal means it’s impossible to audit unless they give us their centralized database and server code.
If you are not auditing the source code, you are trusting those that are.
I’m not sure what you are trying to argue.
Even if you audit the code yourself, you still need to trust your OS, the hardware the OS is running on, and the proprietary drivers of each component in that hardware. And at that point you’ve also got to trust that the person who sold you the hardware hasn’t modified it.
Ok and?