General observations on surveillance and accrual of police powers.
You can call me Lazy Susan.
Short read, but very illustrative. It seems weird that courts even bother with the "expectation of privacy" charade anymore, since the answer is always "no."
If you want to be protected by the need for a warrant, you'll need to pay your 47 cents.
You almost have to live here to get the whole McCrory business and how his particular species of brain worms makes sense to anyone.
This is a state where your professional coworkers will unironically say crap like "magic black unicorn" (the preferred nomenclature for Obama) and sincerely believe that natural disasters are from homosexuals fornicating.
There really are a huge number of folks here that would clearly benefit from some time and self-reflection in a FEMA re-education camp in an abandoned Walmart.
You almost have to live here to get the whole McCrory business and how his particular species of brain worms makes sense to anyone.
This is a state where your professional coworkers will unironically say crap like "magic black unicorn" (the preferred nomenclature for Obama that they can say in public) and sincerely believe that natural disasters are from homosexuals fornicating.
There really are a huge number of folks here that would clearly benefit from some time and self-reflection in a FEMA re-education camp in an abandoned Walmart.
FTFY.
Working as intended.
Yep.
The State of California is working on a couple of resolutions to create a database firewall between themselves and the Federal Government. The EFF is helping craft the legislation. Basically they want to prevent the creation of class-based databases, and prevent unwarranted sharing of personal data. Could get very interesting.
https://www.eff.org/deeplinks/2017/0...
Senate Bill 54, authored by Senate President Pro Tempore Kevin de León, would prevent law enforcement agencies in California from sharing department databases or private information with the federal government for immigration enforcement. It would also require California state agencies to update their confidentiality policies so that they stop collecting or sharing unnecessary data about every Californian.

Senate Bill 31, authored by Sen. Ricardo Lara, would prevent local and state government agencies from collecting data, sharing data, or using resources to participate in any program that would create a registry of people based on their religion, ethnicity, or national origin. Police agencies would also be forbidden from creating a database of religious minorities in California.
That's assuming CA can protect its data against the feds (it can't), and more importantly against the NSA, or that the feds couldn't just build their own databases with the collection methodologies they already have.
With technological advances, though, you need to replace "okay" with "necessary", when you don't control the actions of that someone else. Nuclear weapons are not "okay", but we'd be in a bad place today without them...
With technological advances, though, you need to replace "okay" with "necessary", when you don't control the actions of that someone else. Nuclear weapons are not "okay", but we'd be in a bad place today without them...
That seems the slipperiest of slippery slopes. Hitting back first is pretty hard to justify, regardless of what you're hitting with.
EDIT: Especially with things like nukes. The US developed them first. To say that it's important that you have them because other countries ("bad dudes" countries) have them is like retaliating when someone defends themselves.
That's assuming CA can protect its data against the feds (it can't), and more importantly against the NSA, or that the feds couldn't just build their own databases with the collection methodologies they already have.
It's harder for the feds to collect the data if the states don't do it for them in the first place, which is the point of one of the bills.
In "A Good American," a new documentary that goes into widespread release today, director Friedrich Moser tells Binney's story from his early days as an intelligence analyst during the Vietnam War to his service as a codebreaker during the Cold War to his visionary program for conducting electronic surveillance with an emphasis on privacy and the rule of law. Binney and his fellow whistleblowers tell the story of how General Michael Hayden, then head of the NSA, sidelined their proposals in favor of a multibillion-dollar boondoggle called Trailblazer, which collapsed without ever shipping -- and how Hayden and his team refused to allow NSA analysts to work in the wake of the 9/11 attacks, literally locking them out of the building while they plotted ways to shift the blame for intelligence failures and use the attacks to build private, well-funded permanent civil service empires.
Robear wrote:
With technological advances, though, you need to replace "okay" with "necessary", when you don't control the actions of that someone else. Nuclear weapons are not "okay", but we'd be in a bad place today without them...

That seems the slipperiest of slippery slopes. Hitting back first is pretty hard to justify, regardless of what you're hitting with.
That's not my point, though, I'm not suggesting we justify attacking first with new orbital space lasers just because we have them. What I'm saying is that once *any* significant new technology is widely available - and that can come soon enough after it is known to be possible - then nations that are trying to maintain their position in the world are forced into implementing it. This in turn can change the nature of many related things. We've seen military technology change warfare significantly over the last 200 years; we're seeing surveillance and communications tech change the nature of privacy to the point where we will likely need to re-evaluate our expectations and accommodations (like using crypto) not just socially, but under the law. And that's one way that societies change over time.
Chumpy McChump wrote:
Robear wrote:
With technological advances, though, you need to replace "okay" with "necessary", when you don't control the actions of that someone else. Nuclear weapons are not "okay", but we'd be in a bad place today without them...

That seems the slipperiest of slippery slopes. Hitting back first is pretty hard to justify, regardless of what you're hitting with.
That's not my point, though, I'm not suggesting we justify attacking first with new orbital space lasers just because we have them.
I'm suggesting that being the first to develop orbital space lasers is very similar to attacking first. By advancing the tech, by your own suggestion it "forces" other players to do the same. Considering the US's history of making the tech that everybody "needs", I don't think you get to play the "we'd be in a bad place today without them" card as justification.
So, what's your proposal to deal with new technologies, developed here or elsewhere? Do we suppress them somehow? Make the Patent Office top secret? Who decides what's dangerous, and what's not? How, exactly, do we clamp down that hard on research?
Yes. When new tech comes along, many other countries and companies and people want it. What you're suggesting is that we somehow filter that deluge to prevent... what? Look at telephones. They are ubiquitous. But they are also the source of *billions* of dollars of losses every year through app fraud and identity theft alone. They are platforms for the mass surveillance of, potentially, everyone in the world.
What should we do about phones? Ban them? Remove them? Clearly, there is a lot more going on than you've implied. So I'm curious about how you intend to prevent the potential harms caused by new technologies.
What should we do about phones? Ban them? Remove them? Clearly, there is a lot more going on than you've implied. So I'm curious about how you intend to prevent the potential harms caused by new technologies.
All you can do is wait for the law to catch up to new technologies, something that takes far longer than some people are comfortable with.
Come now, Robear. You start by justifying the US having nuclear weapons because other actors have them (and never mind that the US developed them in the first place), and when pressed, talk about banning phones?
Remember where the conversation started.
"Someone else will do it, so that means it's okay for us to do it" doesn't strike me as especially valid argumentation.
and you followed with
With technological advances, though, you need to replace "okay" with "necessary", when you don't control the actions of that someone else. Nuclear weapons are not "okay", but we'd be in a bad place today without them...
You're arguing for pre-emptive escalation, and using a situation created by said escalation to justify it. There's a fallacy in there somewhere, but I can't remember it.
Oh, and "Who decides what's dangerous and what's not?" ... I agree that often it's a gray area. I don't think you could put nukes in that category.
I'm *not* arguing for pre-emptive escalation. I'm saying that when technologies exist, they will be implemented. There's a difference.
I've brought this point up before, in the same way. I don't argue that "if you're going to do it, do it bigger than anyone else". I'm simply noting that any idea that countries will ignore technologies that they can afford to implement, that affect their security in the world, and that other countries are implementing, is wrong. We can't just say "this technology sucks, let's just not instantiate it". Especially in the world of surveillance technology.
We need to deal with the technologies in the law, and in treaties. We can't stuff the genies back in their bottles. That's not arguing for pre-emptive escalations of technologies, it's acknowledging reality.
I'm *not* arguing for pre-emptive escalation. I'm saying that when technologies exist, they will be implemented. There's a difference.
I've brought this point up before, in the same way. I don't argue that "if you're going to do it, do it bigger than anyone else". I'm simply noting that any idea that countries will ignore technologies that they can afford to implement, that affect their security in the world, and that other countries are implementing, is wrong. We can't just say "this technology sucks, let's just not instantiate it". Especially in the world of surveillance technology.
There's also a substantial difference between saying, "They have this, we need to build something to counter/neutralize it." and saying, "They have this, so we need it too. (ie, if they're going to do it to us, we're going to do it to them)".
The whole point of the United States is that we're not supposed to be like them.
The whole history of the United States is that we're like them.
Malor wrote:
The whole point of the United States is that we're not supposed to be like them.
The whole history of the United States is that we're like them.
We're not supposed to be, though.
It's the national myth, of course, carefully nurtured over the past couple of centuries. From the early 19th century's lionizing of pioneers and founding fathers to the anti-communist land-of-the-free of the mid-20th, the United States has an identity that's all about American exceptionalism, even as the actual history has regularly featured genocide, slavery, and paper-thin excuses to invade countries that never attacked us, like Cuba.
There is no justification for mass surveillance of one's own citizens.
That's ridiculous and you know it! There are probably thousands of reasons for running surveillance of one's own citizens, but the big ones are to prevent rebellion, prevent crimes, seek out dissidents, and to find political enemies. Granted, in the US we have the Bill of Rights that is supposed to make that illegal, and in theory impossible, but that doesn't mean there aren't good reasons to do it.
This whole topic is the one thing I miss about Justice Scalia. For all of his disgusting intentional misreading of several Constitutional passages, he was just about the only person in the government who seemed to be willing to be outspoken about the perils of increased surveillance and invasion of privacy.
but the big ones are to prevent rebellion, prevent crimes, seek out dissidents, and to find political enemies.
Looking back through history dissent and rebellion have frequently been social goods. If you can get rid of them a lot of social progress will stall.
Atras wrote:
but the big ones are to prevent rebellion, prevent crimes, seek out dissidents, and to find political enemies.
Looking back through history dissent and rebellion have frequently been social goods. If you can get rid of them a lot of social progress will stall.
I don't think that Atras' point was that the end-goal that justifies mass surveillance is social progress.