
News sites use "robots.txt" to block Apple from scraping their sites for AI training data.

Major platforms like Facebook and the New York Times are using a file called "robots.txt" to block Apple from scraping their websites for AI training. Apple has been offering publishers millions of dollars for the right to scrape their sites and use the data to train its AI models. The robots.txt file lets site owners signal that they do not want their content scraped, and many news sites are using it to keep Apple Intelligence from accessing theirs.
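For context, robots.txt is a plain-text file served at a site's root (e.g. at the path /robots.txt) that addresses crawlers by their user-agent token and lists paths they should not fetch. A minimal sketch of the kind of rule the article describes, assuming Apple's published crawler tokens (Applebot for search indexing, Applebot-Extended as the separate opt-out token for AI-training use):

```
# Allow Apple's crawler for normal search indexing
User-agent: Applebot
Allow: /

# Opt the whole site out of use for AI training
User-agent: Applebot-Extended
Disallow: /
```

Compliance is voluntary: robots.txt is a convention that well-behaved crawlers honor, not an access control mechanism, which is why the opt-out only works if Apple's crawler respects it.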
