Ew, probably still censored.
You can self-host it, right?
If the model is censored… then what, retrain it? Or train it from scratch, like what open-r1 is doing?
The self-hosted model has the censorship hard-coded into it.
The censorship only exists on the version they host, which is fair enough. If they’re running it themselves in China, they can’t just break the law.
If you run it yourself, the censorship isn’t there.
Untrue; I downloaded the vanilla version and the censorship is hard-coded in.
Yeah, I think censoring built into the LLM itself would be pretty vulnerable to circumvention.
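Easy enough to check yourself. Here's a rough sketch of prompting a locally downloaded R1 distill with Hugging Face transformers; the checkpoint name and the prompt are just examples, not a claim about which models refuse what. If the local weights refuse too, the filtering is in the model, not on DeepSeek's servers.

```python
# Minimal sketch: probe a locally downloaded checkpoint for baked-in refusals.
# The model ID and prompt are illustrative; swap in whatever you actually pulled.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"  # example checkpoint (assumption)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Ask something the hosted version reportedly refuses; if the local weights
# also refuse, the censorship lives in the model, not a server-side filter.
messages = [{"role": "user", "content": "What happened at Tiananmen Square in 1989?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
# Print only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```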