In an unsettling study published today, researchers from Duke University approached 12 data brokers and bought thousands of records about American service members with minimal vetting.
The study highlights the acute privacy and national security risks created by data brokers. These companies are part of a shadowy multibillion-dollar industry that collects, aggregates, buys, and sells data, practices that are currently legal in the US, exacerbating the erosion of personal and consumer privacy. Read the full story.
—Tate Ryan-Mosley
The inside scoop on watermarking and content authentication
Last week, President Biden released his executive order on AI, a sweeping set of rules and guidelines designed to improve AI safety and security. The order put great emphasis on watermarking and content authentication tools, which aim to label content to determine whether it was made by a machine or a human. The White House is making a big bet on these techniques as a way to fight AI-generated misinformation.
The White House is encouraging tech companies to create new tools to help consumers discern whether audio and visual content is AI-generated, and plans to work with the group behind the open-source internet protocol known as the Coalition for Content Provenance and Authenticity, or C2PA. Tate Ryan-Mosley, our senior tech policy reporter, has written a handy guide to C2PA, what it can achieve, and, crucially, what it can't. Read the full story.
This story is from The Technocrat, our weekly newsletter covering tech and politics. Sign up to receive it in your inbox every Friday.