Apple is reportedly developing a tool that would scan for child sexual abuse material (CSAM) in your iPhone photos using hashing algorithms. The system is said to run on the user's device itself, rather than remotely, for greater security and privacy.
from Gadgets 360 https://ift.tt/3Cn1zQa
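The report only describes the approach at a high level, but the basic idea behind hash-based matching is straightforward: compute a compact fingerprint of each photo and check it against a database of fingerprints of known material. Below is a minimal illustrative sketch in Python. The hash database, folder name, and use of a cryptographic hash (SHA-256) are assumptions for the example; a real system of this kind would reportedly rely on a perceptual hash, so that resized or re-encoded copies of an image still match, along with a privacy-preserving matching protocol.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-image fingerprints (hex digests), for illustration only.
# A production system would use a vetted, securely distributed database, and a
# perceptual hash rather than a cryptographic one.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def image_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of the file's raw bytes (illustrative stand-in)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_library(photo_dir: Path) -> list[Path]:
    """Return the photos whose fingerprints appear in the known-hash set."""
    matches = []
    for photo in photo_dir.glob("*.jpg"):
        if image_hash(photo) in KNOWN_HASHES:
            matches.append(photo)
    return matches


if __name__ == "__main__":
    for hit in scan_library(Path("photos")):
        print(f"Match found: {hit}")
```

Running the whole check on-device, as the report describes, means the photos themselves never need to leave the phone; only the comparison result would ever be acted upon.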