Schneier thinks that this is really about media companies wanting to control consumer use of video, audio, and text. Perhaps, but let's leave the moral and ethical issues to others for the moment. There are plenty of serious practical problems facing any tech company that enables such a scheme.
Once we go down this path -- giving one device authority over other devices -- the security problems start piling up. Who has the authority to limit the functionality of my devices, and how do they get that authority? What prevents them from abusing that power? Do I get the ability to override their limitations? In what circumstances, and how? Can they override my override?

How do we prevent this from being abused? Can a burglar, for example, enforce a "no photography" rule and prevent security cameras from working? Can the police enforce the same rule to avoid another Rodney King incident? Do the police get "superuser" devices that cannot be limited, and "supercontroller" devices that can limit anything? How do we ensure that only they get them, and what do we do when the devices inevitably fall into the wrong hands?

Keeping all that in mind, what kind of potential liability do companies now face? When a consumer's home is burgled, someone is beaten by the police, or a business user loses thousands of dollars of work because a device was ordered to shut down at an inopportune moment, who do you think is going to be named in the inevitable lawsuit? It will be the manufacturer that failed to "adequately warn" the consumer or user of the dangers -- not just in general, but for each and every specific situation that the person did not anticipate and could not reasonably have been expected to anticipate. And then the shareholders and directors are going to demand to know why corporate management didn't anticipate the liability problem when it cooked up the scheme in the first place.
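To see why those questions have no clean answer, consider a deliberately naive sketch of such an authority scheme. Everything here is hypothetical -- the class names, the numeric authority levels, and the rules are invented for illustration, not drawn from any real protocol -- but even this toy model shows the problem: whoever assigns the authority numbers wins, and "can they override my override?" just becomes an endless reimposition loop.

```python
# Hypothetical model of device-over-device authority. All names and the
# numeric "authority" scheme are illustrative assumptions, not a real API.
from dataclasses import dataclass, field

@dataclass
class Device:
    name: str
    authority: int                      # higher number wins disputes -- but who assigns it?
    restrictions: set = field(default_factory=set)

def impose(controller: Device, target: Device, rule: str) -> bool:
    """A controller restricts a target; succeeds only with higher authority."""
    if controller.authority > target.authority:
        target.restrictions.add(rule)
        return True
    return False

def override(owner: Device, target: Device, rule: str) -> bool:
    """An owner lifts a restriction on a device they outrank.
    Nothing stops a higher-authority controller from simply reimposing it."""
    if owner.authority >= target.authority and rule in target.restrictions:
        target.restrictions.discard(rule)
        return True
    return False

# The burglar scenario: a rogue beacon with a high authority level
# (however it obtained one) silently blinds a security camera.
camera = Device("security-camera", authority=1)
jammer = Device("no-photography-beacon", authority=5)
impose(jammer, camera, "no-photography")    # camera is now restricted
```

Note what the model cannot express: there is no principled way to decide whether the owner's override or the controller's reimposition should be final, which is exactly the "supercontroller" escalation the questions above describe.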