As I understand it, most adult content producers aren’t actually interested in having minors using their sites. It seems like the easiest thing to do would be to have them add some “Adult Material” flag in their metadata, and let consumers respond as they wish to that tag - whether that’s done through browser settings, router nannyware, or whatever.
Is there a technical reason this isn’t what’s being pushed for? I’m sure there’s lobbying and “optics” reasons for not doing this, but is there any practical reason for not pursuing this, or something like it?
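For what it's worth, something close to this already exists: the voluntary RTA ("Restricted To Adults") label, a fixed meta-tag value that participating adult sites embed and that filtering software can key off. Here's a minimal consumer-side sketch of how a filter might check for it — the class and function names are my own invention, but the label string is the real RTA value:

```python
from html.parser import HTMLParser

# The RTA label is a fixed, well-known string; any page carrying it
# in a <meta name="rating"> tag is self-declaring as adult content.
RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

class RatingMetaScanner(HTMLParser):
    """Scan an HTML document for <meta name="rating"> tags."""

    def __init__(self):
        super().__init__()
        self.adult_flagged = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        if a.get("name", "").lower() == "rating" and a.get("content") == RTA_LABEL:
            self.adult_flagged = True

def is_adult_tagged(html: str) -> bool:
    """Return True if the page self-labels as adult via the RTA tag."""
    scanner = RatingMetaScanner()
    scanner.feed(html)
    return scanner.adult_flagged
```

A browser extension, router, or proxy could run a check like this and block or allow accordingly — which is exactly the "let consumers respond as they wish" model. The catch is that the label is voluntary, so it only covers sites that choose to cooperate.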
We already have multiple solutions for blocking children from websites parents don't want them to access, and the companies providing those solutions maintain their own databases of content, tagged by category, so that parents have some control over what is blocked and what is not. This stuff has existed since the 90s; it's nothing new. It does require parents taking the initiative, though, and really, when we get down to it, this is another "but think of the children" situation, where child safety is being used as cover for making it easier to collect people's biometric data online.
The hell are you talking about? Yes, there are blocklists for adult domains, but that doesn't actually block adult content, since it leaves sites like YouTube open. If you think there isn't full-on sex on there, you'll be surprised.
The only thing that functions right now is whitelisting, and it's super annoying, since so many apps open a web container inside the app. All this ID verification is nonsense, but providing an actually filtered internet is still nigh impossible for parents who aren't tech savvy.
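On the decision side, a whitelist really does boil down to something this simple — the hard part is maintaining the list and intercepting every app's embedded web view, not the check itself. A sketch (the hostnames here are made up):

```python
# Allowlist-based filtering: only explicitly approved hosts are
# reachable; everything else is blocked by default.
ALLOWED_HOSTS = {"en.wikipedia.org", "kids.example.org"}  # hypothetical list

def is_allowed(host: str) -> bool:
    """Return True if the host or any parent domain is on the allowlist.

    Matching parent domains means "simple.en.wikipedia.org" passes
    when "en.wikipedia.org" is approved.
    """
    parts = host.lower().split(".")
    return any(".".join(parts[i:]) in ALLOWED_HOSTS for i in range(len(parts)))
```

The default-deny posture is what makes this the only approach that reliably works — but it's also why it's so high-maintenance, since every legitimate new site a kid needs has to be added by hand.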
Pretty easy solution to that: don't let your kid have access to YouTube without observing what they're watching. If a parent isn't willing to learn how to set up parental controls and/or web filtering, and to take the time to observe what their child is consuming, then the job shouldn't be shoved onto the government and made a problem for everyone else.