Security and privacy advocates are gearing up for another uphill battle against Recall, the AI tool rolling out in Windows 11 that screenshots, indexes, and stores nearly everything a user does every three seconds.
When Recall was first introduced in May 2024, security practitioners lambasted it for creating a gold mine for malicious insiders, criminals, or nation-state spies who managed to gain even brief administrative access to a Windows device. Privacy advocates warned that Recall was ripe for abuse in intimate partner settings. They also noted that there was nothing stopping Recall from preserving sensitive disappearing content sent through privacy-protecting messengers such as Signal.
Total Recall
Following months of backlash, Microsoft subsequently suspended Recall. On Thursday, the company said it was reintroducing the feature. For now, it is available only to insiders with access to the Windows 11 Build 26100.3902 preview version. Over time, the feature will be rolled out more broadly. Microsoft officials wrote:
Microsoft is hoping that the concessions critics demanded, namely making Recall opt-in and adding the ability to pause it, will help quell the collective revolt that broke out last year. For several reasons, it likely won't.
First, even if user A never opts in to Recall, they have no control over the settings on the devices of users B through Z. That means anything user A sends to those users will be screenshotted, processed with optical character recognition and Copilot AI, and then stored in an indexed database on those users' devices. This would indiscriminately sweep up all manner of user A's sensitive material, including photos, passwords, medical conditions, and encrypted videos and messages. As a writer for Privacy Guides put it on Mastodon:
An easily searchable database capturing a user's every waking moment will also be a bonanza for others who don't have the user's best interests at heart. An archive that detailed will inevitably attract subpoenas from lawyers and governments. Threat actors who manage to install spyware on a device will no longer have to scour it for the most sensitive data stored there. Instead, they will mine Recall the same way they mine password databases today.
Microsoft didn't immediately respond to a message asking why it reintroduced Recall less than a year after the feature received such a chilly reception. To critics, Recall is likely to remain one of the most glaring examples of the recent trend of shoehorning unwanted AI and other features into existing products, sometimes with negligible benefit to users.
This story originally appeared on Ars Technica.