West: Post-Spectre web development
Posted Feb 27, 2021 14:05 UTC (Sat) by jgg (subscriber, #55211)
Parent article: West: Post-Spectre web development
In a sandbox like a web browser, isn't it reasonable to just block access to this high-resolution data? E.g., by limiting timer resolution, blocking performance counters, and/or adding randomness.
Posted Feb 27, 2021 14:22 UTC (Sat) by Paf (subscriber, #91811)
An attacker can just do a statistical analysis of response times for whatever they gin up, and gradually extract data from that. Any randomness added would have to be added to *every* interaction, and if it didn’t change, it could presumably be puzzled out. Maybe it’s possible to do ... something ... with that and a CSPRNG? Eek. That seems fraught.
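A minimal sketch of why added jitter alone doesn't help (all names and numbers here are illustrative, not from the comment): if a timer adds uniform random noise to each reading, the noise in the difference of two readings has zero mean, so averaging many repeated measurements recovers the hidden duration anyway.

```javascript
// Hypothetical demo: a "true" interval we want to hide, and a timer
// that adds uniform jitter much larger than that interval.
const trueDelta = 5;       // real elapsed "time" per measurement
const jitterRange = 100;   // jitter added to every timer reading

// A timer reading with noise in [-jitterRange/2, +jitterRange/2)
const noisyRead = (t) => t + (Math.random() - 0.5) * jitterRange;

function estimate(samples) {
  let sum = 0;
  for (let i = 0; i < samples; i++) {
    // Difference of two noisy readings bracketing the same interval;
    // the jitter cancels in expectation, trueDelta does not.
    sum += noisyRead(trueDelta) - noisyRead(0);
  }
  return sum / samples; // converges toward trueDelta as samples grows
}

console.log(estimate(1_000_000).toFixed(2));
```

With a million samples the standard error of the mean is well under 0.1, so the 5-unit interval stands out despite 100 units of per-reading jitter.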
Posted Feb 27, 2021 14:59 UTC (Sat) by Cyberax (✭ supporter ✭, #52523)
The only way to defeat this is to disable shared-memory multithreading for JS.
Posted Feb 28, 2021 3:10 UTC (Sun) by excors (subscriber, #95769)
That's why, when Spectre came out, browsers disabled SharedArrayBuffer (the main API for sharing memory between JS threads, i.e. workers) along with the high-resolution timer APIs, until they could implement mitigations. The mitigation runs each site in a separate process (when enabled with certain HTTP headers) and relies on address-space isolation to prevent Spectre-like attacks from stealing data from other sites. So SharedArrayBuffer and high-resolution timers can now be used again, but only by sites that opt in to running in isolated processes.
You can create a high-resolution timer by running a thread that increments a shared variable and observing it from another thread.
