
A malicious Pidgin plugin

The developers of the Pidgin chat program have announced that a malicious plugin had been listed on its third-party plugins list for over one month. This plugin included a key logger and could capture screenshots.

It went unnoticed at the time that the plugin was not providing any source code and was only providing binaries for download. Going forward, we will be requiring that all plugins that we link to have an OSI Approved Open Source License and that some level of due diligence has been done to verify that the plugin is safe for users.



Reproducible builds

Posted Aug 26, 2024 22:27 UTC (Mon) by python (guest, #171317) [Link] (29 responses)

This incident seems like a good reason to try to make build reproducibility and verification a more common process. Requiring the source to be published AND to match the binary is a pretty hefty deterrent compared to what is currently considered acceptable.
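
A minimal sketch, in Python, of the check being proposed here, assuming the plugin publishes a source tarball and a deterministic build: rebuild from source in a pinned, offline environment and refuse to list the binary unless the digests match. The paths and build command are illustrative, not any actual Pidgin process.

    # Rebuild-and-compare sketch; "make" stands in for whatever pinned,
    # offline build the listing site would actually run.
    import hashlib
    import subprocess
    import sys

    def sha256(path):
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def verify(source_dir, published_so, rebuilt_so):
        # Only meaningful if the toolchain is held constant too;
        # reproducibility is a property of the whole environment.
        subprocess.run(["make", "-C", source_dir], check=True)
        return sha256(rebuilt_so) == sha256(published_so)

    if __name__ == "__main__":
        ok = verify("plugin-src", "plugin-published.so", "plugin-src/plugin.so")
        print("MATCH" if ok else "MISMATCH: do not publish")
        sys.exit(0 if ok else 1)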

Reproducible builds

Posted Aug 26, 2024 23:01 UTC (Mon) by pizza (subscriber, #46) [Link]

> This incident seems like a good reason to try to make build reproducibility and verification a more common process. Requiring the source to be published AND to match the binary is a pretty hefty deterrent compared to what is currently considered acceptable.

The only way to guarantee that is for the publisher to generate the binaries themselves, ideally in an environment that lacks external network access.

No source means no binaries can get created. Which is a problem for non-F/OSS stuff but that's probably not a use case that Pidgin cares about.

(This is the approach F-Droid takes, for example.)

Reproducible builds

Posted Aug 26, 2024 23:28 UTC (Mon) by KJ7RRV (subscriber, #153595) [Link] (6 responses)

Considering that Pidgin would have to rebuild the extensions to verify the build, would there be any benefit to having reproducible builds, as opposed to non-reproducible ones done by Pidgin?

Reproducible builds

Posted Aug 26, 2024 23:43 UTC (Mon) by shironeko (subscriber, #159952) [Link]

Having reproducible builds reduces the amount you need to trust Pidgin as well; it also makes it obvious if, say, someone manages to get into their build servers.

Reproducible builds

Posted Aug 27, 2024 1:23 UTC (Tue) by kazer (subscriber, #134462) [Link] (4 responses)

You'll want reproducible builds to ensure no other component has been backdoored. Like the compiler, for instance.
That does mean you need a setup where you can confirm every component the same way.

Reproducible builds

Posted Aug 27, 2024 8:05 UTC (Tue) by aviallon (subscriber, #157205) [Link] (3 responses)

Just build everything in a Nix environment.
You get reproducibility of everything almost for free.

Reproducible builds

Posted Aug 27, 2024 9:50 UTC (Tue) by intelfx (subscriber, #130118) [Link] (1 response)

Funny you say that, when we have another article on the front page where a reproducible builds expert says that Nix has no relation whatsoever to magically making builds reproducible.

Reproducible builds

Posted Aug 27, 2024 11:02 UTC (Tue) by jhe (subscriber, #164815) [Link]

The "Just use X, it solves all your problems" style post are a menace.

Reproducible builds

Posted Aug 28, 2024 21:16 UTC (Wed) by Heretic_Blacksheep (guest, #169992) [Link]

From what I can tell, NixOS reproducibility is aspirational, not a guarantee at all, even if you follow their how-to steps for doing so. There are very big caveats here that break the recommendation.

For reproducible builds you need the original source code to be reviewable, along with all subsequent patches; verification that nothing was altered in transit; someone to actually verify that there are no shenanigans in all that source code (they have to read it AND understand it, without making mistakes due to tiredness or too much caffeine!); then to compile it; then to verify that the binary did indeed result from that source code blob and that the compiler itself didn't alter it; and to verify that included external libraries didn't alter the function or object meanings. This is why "just use Y" recommendations aren't useful here. AFAIK, no single distribution does all of these steps, because it's too (skilled-)labor-intensive. Otherwise you have a situation where a binary object file is "reproducible", but that term is completely meaningless, because no one knows if there's malware in the original source code or the compiler's output.

Isn't constructive paranoia grand? And that doesn't even get into the next problem: what if the reproducibility tech itself has been suborned?

Reproducible builds

Posted Aug 27, 2024 10:13 UTC (Tue) by atnot (subscriber, #124910) [Link] (20 responses)

I think it's also a good reason to invest in better sandboxing/less privileges for plugins.

The fact that a Pidgin plugin that you can just install off of the official site is even capable of taking screenshots and logging keys is what's actually alarming and irresponsible to me; the fact that this actually happened was just an inevitability.

Reproducible builds

Posted Aug 27, 2024 10:45 UTC (Tue) by ibukanov (subscriber, #3942) [Link] (7 responses)

I would rather see more progress towards a very restricted sandbox for GUI applications on Linux, with services like screen capture/recording or even file open/save dialogs provided by the OS itself, with no general access to files from the sandbox.

Flatpak sandboxing

Posted Aug 27, 2024 15:59 UTC (Tue) by gnoutchd (guest, #121472) [Link] (6 responses)

That sort of sandboxing is a long-term goal of Flatpak, and IIUC the file-dialog APIs of the major toolkits already support their "portal" mechanism for controlled file access. I believe there are similar mechanisms to control screenshot/screencast access on Wayland desktops.
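
To give a concrete feel for the portal mechanism, here is a minimal Python sketch of an app asking the XDG FileChooser portal for a file over D-Bus (assuming dbus-python and a running portal backend; error handling and the race around signal subscription are glossed over). The app never browses the filesystem itself; it only receives whatever URI the user picked.

    import dbus
    from dbus.mainloop.glib import DBusGMainLoop
    from gi.repository import GLib

    DBusGMainLoop(set_as_default=True)
    bus = dbus.SessionBus()

    portal = bus.get_object("org.freedesktop.portal.Desktop",
                            "/org/freedesktop/portal/desktop")
    chooser = dbus.Interface(portal, "org.freedesktop.portal.FileChooser")
    loop = GLib.MainLoop()

    def on_response(response, results):
        # response == 0 means the user picked something; the app is
        # granted access to exactly these URIs and nothing else.
        if response == 0:
            print("granted:", results["uris"])
        loop.quit()

    # OpenFile returns a Request object path; the answer arrives later
    # as a Response signal on that object, once the user has chosen.
    request_path = chooser.OpenFile("", "Pick a file to share", {})
    bus.add_signal_receiver(on_response,
                            signal_name="Response",
                            dbus_interface="org.freedesktop.portal.Request",
                            path=request_path)
    loop.run()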

That said, for various reasons many important applications are not meaningfully sandboxed yet (for which they've been harshly criticized). Also, as others have discussed, it's not clear if that kind of sandboxing would have helped in this case.

Sandboxing this malice is not easy

Posted Aug 27, 2024 17:31 UTC (Tue) by farnz (subscriber, #17727) [Link] (2 responses)

The trouble here is that what you want to permit is for this plugin to take screenshots and transmit them over the network to the places the user intended them to be sent to, but not transmitted over the network to places the user did not intend them to be sent to. And that's a really hard problem to solve in a sandboxing mechanism - if the screenshots can't leave your system, then it's secure, but not useful, while if they can leave the system, how do you distinguish "going to the IM contact the user intended" from "going to the IM contact the plugin author has secretly chosen to send things to"?

This is a general problem in securing networked compute - how do I permit non-malicious communication, without permitting malicious communication - and it's arguably impossible to solve, since it requires understanding not just what is happening, but also what the user expects to happen.

Indeed, but we can at least sandbox the keylogger

Posted Aug 27, 2024 19:08 UTC (Tue) by gnoutchd (guest, #121472) [Link] (1 response)

Agreed, yeah; a general-purpose sandbox probably can't help with that. You'd probably have to put the plugins in a special-purpose sandbox with no TCP/IP access and expose a high-level message-sending API that you could reasonably filter on. Dunno if that would even be practical here.

That said, a good Flatpak-like sandbox should limit what a keylogger could collect, and IMHO that's pretty valuable already.
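
A hypothetical sketch of that special-purpose sandbox idea: the plugin gets no socket, only a broker handle, and the broker delivers only to contacts the user approved. (As the reply below notes, this only helps insofar as the attacker can't talk the user into approving an account they control; deliver_via_im is a hypothetical IM-layer call.)

    class MessageBroker:
        """Plugin-facing send API; the plugin has no other network path."""

        def __init__(self, approved_contacts):
            # Chosen by the user in the host UI, never by the plugin.
            self._approved = set(approved_contacts)

        def send_screenshot(self, contact, png_bytes):
            if contact not in self._approved:
                # A covert exfiltration target is simply unreachable.
                raise PermissionError(f"{contact} was not approved by the user")
            deliver_via_im(contact, png_bytes)  # hypothetical IM-layer call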

Perfect sandboxing probably impractical here - but let's not let perfect be the enemy of better

Posted Aug 27, 2024 19:44 UTC (Tue) by farnz (subscriber, #17727) [Link]

Even a high-level message-sending API is impractical here - I, as the malicious attacker, come up with a plausible reason for the plugin to send messages to an account I control and that I also use for data exfiltration. Similar applies to anti-keylogging protection; I come up with a good reason why I "need" the extra permissions to collect all keypresses, not just a shortcut key that triggers the screen capture, and many people will click OK blindly to get the plugin to function.

That said, having a sandbox that makes the attacker's life harder is a good thing - it increases the chance that someone will think "that's weird" (as Andres Freund did when SSHing to a test box consumed a full CPU core for 0.5 seconds) and investigate, catching the malicious code.

Flatpak sandboxing

Posted Aug 28, 2024 7:45 UTC (Wed) by taladar (subscriber, #68407) [Link] (2 responses)

It always struck me as weird that the very same tools that want the developer to be in charge of packaging instead of distro packagers somehow expect those developers to restrict their own applications via sandboxing in a meaningful way.

Flatpak sandboxing

Posted Aug 28, 2024 18:11 UTC (Wed) by smcv (subscriber, #53363) [Link] (1 response)

A Flatpak repository like Flathub is "the same shape" as a mobile app-store like Google Play or Apple's App Store: the developer packages and submits their app, including choosing what sandboxing parameters (permissions) they will ask for, but the app-store curator is in a position to say "no, we think these permissions are excessive, reduce them or we won't publish your app" if they want to. Flathub doesn't have armies of testers and developer contacts (or a large enough budget to have them), so as far as I know it isn't currently particularly proactive about enforcing review before an app update is published, but in principle it could be.

Flatpak is federated, so you don't *have* to use Flathub (you could download your apps from a different repository), but adding a repository as a source of apps does involve extending some trust to the curators of that repository - not as much as with a traditional packaging framework like apt/dpkg, but some.

The user also has the opportunity to look at what the app is asking for[1] and accept or refuse installation accordingly, again similar to the way some of the more "static" permissions are handled by mobile app-stores.

Of course, if an app legitimately needs a "powerful" permission like screen-sharing in order to do its job, then no amount of review or sandboxing can prevent it from exercising that permission in ways that you didn't want it to, so a Flatpak-style sandbox would not protect you from this particular malware. An app that is malicious or compromised can do anything that its sandbox allows, and its sandbox needs to allow everything that was necessary for it to work as designed; even if the user is asked for permission every time, a competent malware author will disguise their request for access as a request for something that is legitimately needed, for example only starting to exfiltrate screen contents when legitimate screen-sharing functionality is activated.

Also, Flatpak's approach to packaging is focused on apps, more than plugins for those apps; and there is typically no privilege boundary between an app and its plugins, because in a typical plugin architecture they're loaded into the same address space as the app itself and can arbitrarily overwrite its memory. That's another reason why Flatpak-style sandboxing would not have helped a whole lot in this scenario, where a non-malicious app (Pidgin) loads a malicious plugin and becomes compromised as a result. If there is no security boundary within the app, a sandboxing framework like Flatpak can't magic one into existence.

---

[1] assuming a high-quality UI; GNOME Software has this, KDE Discover probably does too, and the flatpak(1) CLI has it but only in a developer-oriented format

Flatpak permissions do not provide informed consent

Posted Aug 29, 2024 7:11 UTC (Thu) by ejona86 (subscriber, #43349) [Link]

The Flatpak UIs do not list all granted permissions (e.g., --talk-name=org.freedesktop.secrets is not shown) and the UI is misleading as it says "can't access X" but some other permission lets the app easily do that[1] (e.g., it says "No Network Access" but has a trivial sandbox escape). Adding permissions during upgrades is handled solely by the UI, and the UIs are inconsistent about receiving opt-in. Those are UI issues that could be improved, but...

... the user is trained to ignore the red security badge. Earlier this year I did a survey of 53 app permissions[2] and only 5 had a useful sandbox, assuming --share=network doesn't have the abstract unix socket escape. It appeared Arch may allow the trivial escape, which would mean only one app had a useful sandbox on Arch[3].

I know the portals are still improving. But it seems a serious failure to think Flatpak provides security above apt/dpkg in practice, or to think it or apps will soon improve enough to matter.

1. https://ejona.ersoft.org/archive/2024/03/02/flatpak-perms...
2. https://ejona.ersoft.org/archive/2024/03/03/flatpak-perm-...
3. Feel free to blame the apps or distros, but the problem remains

Reproducible builds

Posted Aug 27, 2024 11:27 UTC (Tue) by pizza (subscriber, #46) [Link] (10 responses)

> The fact that a Pidgin plugin that you can just install off of the official site is even capable of taking screenshots and logging keys is what's actually alarming and irresponsible to me

"Taking screenshots and logging keys" is a fundamental capability of _any_ X11 application.

Reproducible builds

Posted Aug 27, 2024 12:04 UTC (Tue) by atnot (subscriber, #124910) [Link] (9 responses)

Yes, but there's nothing saying that a Pidgin plugin needs to have access to the full capabilities of X by default, always, without any interaction.

A plugin system's scope should be limited to the application by default, with well-defined, permissioned interfaces for interacting with the rest of the system. Loading an .so file off of the internet in a user-facing application in this day and age is just reckless, especially when those users have been trained by every other piece of software to install things from what look like app stores without having to worry too much about it compromising their *entire system*.
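
A hypothetical sketch of such a scoped plugin interface: the plugin declares what it needs, the host prompts the user, and a capability object is all the plugin ever holds. All names here are illustrative, not any real Pidgin API.

    class PluginContext:
        """What a plugin gets instead of the process's own X connection."""

        def __init__(self):
            self._granted = set()

        def request(self, permission):
            # Host-side prompt; user_consents() is a hypothetical UI hook.
            if not user_consents(permission):
                raise PermissionError(permission)
            self._granted.add(permission)

        def capture_screen(self):
            if "screen-capture" not in self._granted:
                raise PermissionError("screen-capture not granted")
            # Host-mediated capture, with an on-screen indicator shown;
            # host_capture_with_indicator() is likewise hypothetical.
            return host_capture_with_indicator()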

Reproducible builds

Posted Aug 27, 2024 13:15 UTC (Tue) by pizza (subscriber, #46) [Link] (8 responses)

> Yes, but there's nothing saying that a Pidgin plugin needs to have access to the full capabilities of X by default, always, without any interaction.

Either it has _full_ access to X, or it has _no_ access to X. There is no middle ground.

Meanwhile, as others have pointed out, this plugin was explicitly billed as a screen sharing tool. So even if it was technically possible to restrict access to the screen and/or input events (eg through use of Wayland instead of X), the user would have granted those permissions anyway, as that's the reason they installed the plugin in the first place.

Heck, even if this plugin's source code was provided and Pidgin compiled the binaries itself, it would have taken a careful manual audit of the source code to discover the plugin was sharing with an additional party.

Reproducible builds

Posted Aug 27, 2024 15:04 UTC (Tue) by atnot (subscriber, #124910) [Link]

> Either it has _full_ access to X, or it has _no_ access to X. There is no middle ground.

Yes. And it doesn't need access to X. Just wrap the parts you want to expose to plugins through the plugin API, just like web browsers do, or as xdg portals on Linux already do for things that aren't X.

> as others have pointed out, this plugin was explicitly billed as a screen sharing tool

Browsers are also screen sharing tools. But they have technical controls in place such that random websites in the background can't just start recording your screen. You have to authorize it every time, choose what you want to share and then you get a persistent screen element alerting you to the fact that your screen is being recorded.

In the rare case where you want it to do things outside of that sandbox, you have to install a separate program that communicates with the browser, making it very obvious that you're breaching a trust boundary. Compromise through malicious browser plugins has plummeted since the removal of Flash.

> even if this plugin's source code was provided and Pidgin compiled the binaries itself, it would have taken a careful manual audit of the source code to discover the plugin was sharing with an additional party.

Yes, which is why you need to set things up such that this is not straightforwardly possible to do. Pidgin had zero of the layers of Swiss cheese that could have prevented this, and the results are predictable.

Nested X11, sandboxing this type of malice

Posted Aug 27, 2024 15:39 UTC (Tue) by farnz (subscriber, #17727) [Link] (4 responses)

There is a middle ground, of nested X11 servers. The plugin has full access to an X11 server run by Pidgin (such as Xvfb or Xephyr), and Pidgin takes responsibility for communication between the nested X11 server and the host server, copying data back and forth as needed to make the nested server work the way the user expects.
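
Roughly, and assuming Xephyr is installed (the plugin-host binary here is hypothetical):

    import os
    import subprocess

    NESTED = ":5"  # any free display number

    # Start a nested X server; Xvfb would work for a headless variant.
    xephyr = subprocess.Popen(["Xephyr", NESTED, "-screen", "800x600"])

    # The untrusted plugin host only ever sees the nested display, so
    # GetImage/keyboard grabs reach Pidgin's sandbox, not the real screen.
    env = dict(os.environ, DISPLAY=NESTED)
    plugin = subprocess.Popen(["./plugin-host"], env=env)

    plugin.wait()
    xephyr.terminate()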

In this case, though (and as you point out), X11 is a red herring; what was needed was a way to restrict the plugin's access to networking (including IM networks) so that it couldn't share data it had legitimately grabbed with anyone other than the people the user expected it to be shared with.

And reproducible builds are a side-show there; they're no help to anyone if the plugin reproducibly sends keylogger and screenshot data over IM or over a separate IP socket, rather than only doing it in the officially blessed version. You need either auditing of the built version to be sure it's not doing something odd, or sandboxing that results in something odd being noticed. After all, that's basically how the xz vulnerability got noticed - Andres Freund was running sshd in a sandbox setup where the CPU time taken to log in over SSH was visible to him, and trying to untangle that oddity led to the backdoor.

Nested X11, sandboxing this type of malice

Posted Aug 27, 2024 16:39 UTC (Tue) by pizza (subscriber, #46) [Link] (3 responses)

> There is a middle ground, of nested X11 servers. The plugin has full access to an X11 server run by Pidgin (such as Xvfb or Xephyr), and Pidgin takes responsibility for communication between the nested X11 server and the host server, copying data back and forth as needed to make the nested server work the way the user expects.

Which still wouldn't have accomplished anything useful given the nominal "it can't share your screen without having access to your screen" purpose of this plugin.

I suppose this is an additional sign that F/OSS applications are finally becoming meaningful targets for ne'er-do-wells. Unfortunately, most F/OSS doesn't have the (non-pittance and/or sustainable) revenue streams to pay for the necessary (and perpetual) hardening/re-engineering efforts.

Tiny improvement, not a fix

Posted Aug 27, 2024 17:29 UTC (Tue) by farnz (subscriber, #17727) [Link] (2 responses)

It could (but practically probably wouldn't) have defanged the keylogging part, but not the screen-sharing part. Given that the plugin didn't need to access keystroke data, Pidgin could have simply not sent keystrokes into a nested X11 server. But malicious people aren't stupid, and I suspect they'd have come up with a plausible reason why they needed keystroke access (hotkey to trigger screen capture, for example), breaking the sandbox.

Tiny improvement, not a fix

Posted Aug 28, 2024 7:51 UTC (Wed) by taladar (subscriber, #68407) [Link] (1 response)

What you would really want are time-limited permissions to capture the screen and/or keys, active only while there is some on-screen indication that capture is happening, similar to the light on your webcam.
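
As a hypothetical sketch of what such a lapsing grant could look like on the compositor/broker side (all names illustrative; show_recording_indicator is an assumed UI hook):

    import time
    from dataclasses import dataclass, field

    @dataclass
    class CaptureGrant:
        """A screen-capture permission that lapses on its own."""
        app_id: str
        duration_s: float
        granted_at: float = field(default_factory=time.monotonic)

        def valid(self):
            return time.monotonic() - self.granted_at < self.duration_s

    def serve_frame(grant, frame):
        # The broker shows an on-screen indicator while any grant is
        # live and stops handing out frames the moment it lapses.
        if not grant.valid():
            return None
        show_recording_indicator(grant.app_id)  # hypothetical UI hook
        return frame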

Screen capture in use indications

Posted Aug 28, 2024 8:04 UTC (Wed) by farnz (subscriber, #17727) [Link]

Indicators help, but again are often ignored if the application has a good enough explanation for why it needs more permission and the indicators are on all the time. I already see users ignoring the "microphone in use" and "camera in use" indications on Android and iOS because something they've installed tells them to ignore those indications, and I suspect that the same would be true on Linux.

And time-limited permissions run into the basic problem with all requests to a user for permissions: the permission is one the application needs to do its job, and thus it's relatively easy to socially engineer users into allowing the permission for longer than needed, or to allow it indefinitely, citing nebulous "bugs in the framework" that require it to be indefinite in order to do the job you're using the app for.

Wayland does help for the key-logging part

Posted Sep 24, 2024 8:22 UTC (Tue) by daenzer (subscriber, #7050) [Link] (1 response)

> Meanwhile, as others have pointed out, this plugin was explicitly billed as a screen sharing tool. So even if it was technically possible to restrict access to the screen and/or input events (eg through use of Wayland instead of X), the user would have granted those permissions anyway, as that's the reason they installed the plugin in the first place.

For the screen sharing part, sure.

However, I don't know of any (generally-available) mechanism which would allow a Wayland client to receive keyboard input made while none of its own Wayland surfaces have input focus. (There's a portal for global shortcuts, but that can't be abused for key-logging.)

Even for Wayland surfaces created by the Pidgin process, if the plugin didn't know their protocol IDs, it would be more difficult for the plugin to access them and their input via the Wayland protocol than it is via X (which makes it trivial really). Might not be impossible though.

Even via Xwayland, in a Wayland session this plugin could only log keystrokes meant for other X clients, not for native Wayland clients, let alone for compositor-internal dialogues.

I'll have to remember this example for next time someone like birdie claims "the Wayland restriction on keyboard input is pointless, there are no key-loggers taking advantage of X".

Wayland does help for the key-logging part

Posted Sep 24, 2024 8:30 UTC (Tue) by daenzer (subscriber, #7050) [Link]

Also, for screen sharing via the portal, the user has to give consent, can cancel it anytime, and the Wayland session should have some kind of indicator while screen sharing is active. It doesn't allow unlimited silent surveillance, in contrast to X.

Reproducible builds

Posted Aug 27, 2024 11:38 UTC (Tue) by docontra (guest, #153758) [Link]

A quick Duck Duck Go search revealed that the plugin in question was for screen sharing; there's not much you can do with sandboxing to prevent it from taking screenshots and sending them somewhere else.

PS: From the byline of the plugin's site (as seen in Duck Duck Go; the link is dead and I didn't bother to look for it on the Wayback Machine): "Encryption - You can trust that only you & your buddy will be able to see shared screen packets, thanks to LibOTRv4-powered encryption". Yeah, about that...

how it worked?

Posted Aug 27, 2024 23:11 UTC (Tue) by rcampos (subscriber, #59737) [Link]

Does anyone have any insights on how it worked?

I know the code is not there, but if anyone has analyzed the binary or something, I'm very curious to hear about it.

Have Pidgin developers been compromised?

Posted Aug 28, 2024 5:13 UTC (Wed) by mb (subscriber, #50428) [Link]

Had a Pidgin developer installed this plugin?

