Keep track of Berkman-related news and conversations by subscribing to this page using your RSS feed reader. This aggregation of blogs relating to the Berkman Center does not necessarily represent the views of the Berkman Center or Harvard University but is provided as a convenient starting point for those who wish to explore the people and projects in Berkman's orbit. As this is a global exercise, times are in UTC.
The list of blogs being aggregated here can be found at the bottom of this page.
Harvard needs to be as honest with itself as MIT is being. In some ways the tragedies are disproportionate; the Gov 1310 mess has cost no lives that I know of, though I understand that some students were under severe psychological stress, lost weight, and so on. On the other hand, it seems that many dozens will have their lives permanently altered by the experience and the black mark that goes with it on their transcript.
From my conversations with families and students involved in the Gov 1310 case, what strikes me is how un-family-like they feel their interactions with the university have been. Harvard's disciplinary process is meant to be paternalistic; to be sure, parents must sometimes discipline their children while still loving them. There is not much sense out there among the Gov 1310 students I know that Harvard loves them.
Good for MIT for recognizing that its reputation will, in the long run, be enhanced if it tries to figure out if and where it went wrong with Aaron Swartz, and that the best way to do that, in a family setting, is to ask a wise uncle to figure it out and report to the community. I wish Harvard had the same attitude.

The Abelson report is now out. (Here is a link to it.) It is an extremely honorable and good report. It very carefully reconstructs the timeline, documenting insofar as possible every relevant conversation and decision that was taken. That was difficult to do because, for example, both MIT and Swartz had outside counsel (Swartz had several different lawyers at different times). Figuring out whether and why some party at MIT was or was not included in a conversation about which lawyer was talking to which state or federal official must have been a nightmare (lawyers typically are not the most open folks about their back-office conversations).
(2) review the context of these decisions and the options that MIT considered, and (3) identify the issues that warrant further analysis in order to learn from these events.
More generally, has MIT become overly conservative in its institutional decision-making around these incidents? More than once in our interviews, the Review Panel heard members of the MIT community express a feeling that there has been a change in the institutional climate over recent years, where decisions have become driven more by a concern for minimizing risk than by strong affirmation of MIT values. Several people interpreted the Institute’s response in the Swartz case in that light. And some critics have chided MIT for playing such a passive role when Swartz’s actions were motivated by principles that MIT itself champions. Yet we think it is important to view this tragedy in light of a history that may not conform with a myth of a golden past. For this reason we have referred repeatedly to some prior experiences.
One distinguished alumnus said to us, “MIT seemed to be operating according to the letter of the law, but not according to the letter of the heart,” even while he expressed his enormous respect for the MIT leaders who had to grapple with these decisions. Is his concern on target? MIT aspires to be passionate about its principles, but we must also behave prudently as an institution. Of all the decisions MIT’s leadership must make, those that require negotiating a balance between prudence and passion are some of the most wrenching. How can we make those choices easier to confront?
MIT celebrates hacker culture. Our admissions tours and first-year orientation salute a culture of creative disobedience where students are encouraged to explore secret corners of the campus, commit good-spirited acts of vandalism within informal but broadly—although not fully—understood rules, and resist restrictions that seem arbitrary or capricious. We attract students who are driven not just to be creative, but also to explore in ways that test boundaries and challenge positions of power.

That is a wonderful statement about one aspect of MIT's soul. MIT knows what its soul is, or at least the Abelson committee knows. What would a comparable statement about Harvard say? Does Harvard even have a soul? Leave aside the graduate schools. Does Harvard College have a soul? Has any president, provost, dean, or committee chair talked about it lately?
The Swartz and LaMacchia cases were separated by 20 years. Insofar as the Review Team has been able to determine, despite the similarities in the two cases, there was no institutional memory inside MIT spanning these 20 years, and so no mention of David LaMacchia as a precedent in any deliberations around Aaron Swartz by the MIT administration or the Office of the General Counsel, and no discussion of MIT's attitude towards a charge of unauthorized access. (LaMacchia was a student who set up a peer-to-peer file-sharing service on an MIT public workstation.)
Are we becoming a place that, in the words of legal scholar James Boyle, “confuses order with rectitude”?
by Harry Lewis (noreply@blogger.com) at August 02, 2013 02:03 AM

Blue Gardens, the new release on Keysound Recordings, is the brainchild of a new musical talent who goes by the name E.m.m.a. Simply put, it’s some of the best music — and one of the more coherent albums and promising debuts — I’ve heard in years.
My fave tracks are ones like “Shoot the Curl” or “Marina” with their sweetly curdling tunes and clicky soca, where melodies accrue & break & creak & whirr against a solid but shifting Carib-UK backbeat.
Honorable mention to the first single and the album's one vocal cut, "Jahovia." It's great to hear Rebel MC riding the rhythm inna classic reggae-rave fashion while E.m.m.a.'s newfangled textures let you know this is something else. Wicked video too, since somehow, synaesthetically speaking, the music already sounds like washed-out technicolor reels, full of sharp hue-turns –
Plus, you gotta love the array of influences E.m.m.a. pulls into the mix —
“American Nostalgia, Point Break, American high schools, bubble gum, picture houses, Coney Island, Hollywood, proms, Long Island, picket fences, boardwalks, Baroque tonality, Wendy Carlos, Delia Derbyshire, Jeff Wayne, Westerns, sci fi, spaghetti western soundtracks, Encarta ’96: genuinely these are in my mind,” she explains. “I just think the idea of the monopoly the Encarta encyclopaedia had on knowledge is ridiculous in the context of the present day. I’m not ashamed to say it’s my muse.”
And while I really should work to find more words to describe this music I’ve been enjoying so much, I like how it speaks for itself. And how E.m.m.a. does too. Plus, Joe Muggs sorta said it all already –
In essence, E.m.m.a. takes the rugged grooves of UK pirate radio, and adds an extra layer of melody and harmony which connects them back into the much longer history of synthesizer music as well as into a broader, vaguer realm of the imagination. Her synth timbres touch on sci-fi kitsch, Kraftwerk, early video games, the experimental home-listening techno of the 1990s, while the melodies they play have a suspended, dissipating quality, as if caught from a dream just at the point of waking.
Her beats, too, have an uncanny quality, generally touching on several points in UK soundsystem history – garage, rave, grime, the 2008-10 urban house sound of “UK funky”, the more undefinable sounds of “post-dubstep” – sounding familiar but not quite placeable in time. Despite the oddness of this, and despite its clear scholarliness in its sourcing of underground sounds, it’s a welcoming album, one which should be heard well beyond the usual circle of bass music fans. A haunting dream but one well worth getting caught up in.
I’d like to leave it there, but I want to take the opportunity to let this post stand as a long overdue bigup for Martin Blackdown’s Keysound label, consistently representing as it reimagines the sound of London. His program with Dusk on RINSE is, rightly, an international fave. I’ve had a few posts about their music & label sitting in my draft folder for literally years; and I’m remiss for not better publicly registering my enthusiasm for projects like Margins Music (which is, IMO, a 21st century London classic).
It may be high time to dust those drafts off. But I couldn't resist the opportunity to get out the good word about Keysound's latest & greatest. Martin & Emma both bring big ears to what they do, and mine are grateful for it. Yours will be too.
On June 25, 2013, the Opinion of the Advocate General Niilo Jääskinen (AG) in case C-131/12, Google Spain v. Agencia Española de Protección de Datos, was published. This case, which is pending at the Court of Justice of the European Union (CJEU), is being closely watched because one of the questions presented to the court is about the right to be forgotten by search engines. This question implicates the proper balance of freedom of expression and protection of personal data and privacy under EU law.
The case is also interesting because it is the first time that the CJEU is asked to interpret the 1995 Data Protection Directive vis-à-vis search engines. When the CJEU finally reaches a decision in this case, it will be binding not only in the Spanish Courts, but in all the national courts of the 28 Member States of the European Union.
Facts of the Case
In 1998, a Spanish newspaper published, both off-line and online, information about a court-ordered foreclosure auction to pay social security debt. In 2009, the debtor, who had since paid his debt, discovered that ‘googling' his name led to a link to the online notice.
He asked the newspaper to take the information down, but the editor refused as the publication had originally been made by order of the Ministry of Labor and Social Affairs. He then asked Google Spain to stop referencing the link in its search results and also complained to Spain's Data Protection Authority, the Agencia Española de Protección de Datos (AEPD).
The AEPD asked Google to stop indexing the link, but refused to ask the newspaper editor to take the original information down, as the publication was legally justified. Google appealed, and on March 9, 2012, the Audiencia Nacional of Spain issued a reference for a preliminary ruling to the CJEU. This process allows judges from the Member States to ask the CJEU for an advisory opinion as to how they should apply particular EU laws.
Three Questions Referred to the CJEU
There were three main categories of questions referred to the CJEU. The first was about the territorial application of the EU Data Protection Directive, the second was about whether a search engine processes personal data and acts as a data controller under the 1995 Data Protection Directive, and the third was about whether there is a right to be forgotten by a search engine.
1. The Territorial Scope of the Data Protection Directive
Under article 4(1) of the Data Protection Directive, the Directive is applicable to a "data controller" that processes data in the territory of a Member State or uses equipment situated in the territory of a Member State, even if the controller is not established there.
Google's business model is based on keyword advertising; as such, Google processes data linked to selling targeted advertisements to people living in the EU. It has subsidiaries in several Member States. If these subsidiaries act as a bridge between the referencing service and the advertising market, they are establishments within the meaning of article 4(1) of the Directive. Not much surprise here.
2. The Liability of Search Engines
A more interesting question was the applicability of the Data Protection Directive to a search engine. There is no doubt, under the 2003 CJEU Lindqvist case, that the publisher of web pages containing personal data is a data controller. But should a search engine also be considered a data controller?
According to the Advocate General, search engine activities are indeed personal data processing (at 75). However, Google should not be considered a "controller" under the definition of article 2(d) of the Directive, which defines the controller as the person who "determines the purposes and means of the processing of personal data." That is, search engine providers only supply the tools used to locate information, but do not exercise control over personal data. They cannot distinguish personal data from non-personal data, and cannot change information on host servers. Rather, they crawl the web to retrieve and copy web pages in order to index them; this is a "passive relationship to electronically stored or transmitted content"(at 87). Accordingly, the AG concluded that a search engine is generally not a data controller (at 89).
However, according to the AG, a search engine provider can be considered a controller when it exercises discretionary control over the search engine's index (as opposed to automatically caching content from other web pages) (at 91-92). For instance, a search engine sometimes blocks certain search results or does not display some URL addresses. A search engine might also be considered a controller when it deliberately caches content despite an exclusion code on a web page, or refuses to update the cached version of a web page at the website's request (at 93). In these circumstances, it has to comply with article 6(c) and 6(d) of the Directive stating the principles of adequacy, relevancy, proportionality, accuracy and completeness when processing such data (at 94-98).
The Spanish data subject had asked Google to suppress his name from its index. That was not a complaint about Google's exercise of control, but about automatically cached content. The AG concludes that even if a search engine, such as Google, actually processes some personal data, it generally is not a controller under the Data Protection Directive. Therefore a national data protection authority cannot ask a search engine to take down personal data, unless the search engine posted that data notwithstanding exclusion codes or ignored a request by a website to update its cache (at 99-100). The AG deferred on the issue of whether a "notice and takedown" procedure for illegal or inappropriate content might apply to search engines, finding that to be a question of national law based on grounds unrelated to the protection of private information (at 99).
3. Right to be Forgotten
The last of the three questions referred to the CJEU was whether there exists a right to be forgotten, although the answer to that question under the Data Protection Directive would be relevant only in the limited circumstances where a search engine could be considered a data controller. Nevertheless, the AG considered whether either the Directive or the Charter of Fundamental Rights of the EU gave rise to such a right.
Article 12(b) of the Directive provides the data subject with a right to rectify, erase or block the processing of incomplete or inaccurate data, and article 14(a) provides the subject with the right to object to the processing of his data. The Spanish court asked whether these rights include the right to ask a search engine provider to stop indexing personal information, even lawfully published data, thereby giving data subjects a 'right to be forgotten.'
According to the AG, article 12(b) only applies to incomplete or inaccurate information. In this case, the information had been lawfully published, which also makes article 14(a) inapplicable in this case. The AG is of the opinion that "the Directive does not provide for a general right to be forgotten in the sense that a data subject is entitled to restrict or terminate dissemination of personal data that he considers to be harmful or contrary to his interests" (at 108).
The AG then considered whether denying a right to be forgotten is compatible with the 2000 Charter of Fundamental Rights of the EU, which sets forth all the civil, political, economic and social rights of European citizens and all persons residing in the EU.
Among these rights is the protection of personal data, which must be, according to the Charter's article 8.2, "processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law." According to the AG, this provision adds nothing new to the analysis.
But article 7 of the Charter also provides that "[e]veryone has the right to respect for his or her private and family life, home and communications," a right also protected by article 8 of the European Convention on Human Rights. Under CJEU case law, the right to private life with regard to the processing of personal data covers all information relating to an individual, in both his private and his professional sphere (the AG quoted the 2010 Volker und Markus Schecke case).
As Google indeed processes personal data, there is an interference with the right to privacy protected by article 7, and thus, to be legal, this interference must be based on law and necessary in a democratic society under both the Charter and the European Convention on Human Rights.
But the Charter, in its article 11, and the European Convention on Human Rights, in its article 10, also protect freedom of expression and information. Internet users have the right to seek and receive information on the Web; search engines are even said to provide "one of the most important ways to exercise [the] fundamental right to receive information" (at 131). That right would be compromised if search results were sanitized (the AG used the term 'bowdlerized,' referring to Thomas Bowdler, who prepared an 'appropriate' version of Shakespeare in the 19th century).
The AG quoted the 2010 case of Aleksey Ovchinnikov v. Russia, where the European Court of Human Rights found that "in certain circumstances a restriction on reproducing information that has already entered the public domain may be justified, for example to prevent further airing of the details of an individual's private life which do not come within the scope of any political or public debate on a matter of general importance."
However, the right to private life must be balanced against freedom of expression and freedom of information. Giving data subjects a right to be forgotten "would entail sacrificing pivotal rights such as freedom of expression and information," and should not be entertained even on a case-by-case basis through something like a notice-and-takedown procedure. That, according to the AG, would lead either to automatic withdrawal of links or to an unmanageable number of takedown requests (at 133). It would also lead to censoring of content published on the Web, with the search engine playing the uncomfortable role of censor.
Notably, the Advocate General began his opinion by quoting the 1890 Warren and Brandeis article "The Right to Privacy." His view that there should be no right to be forgotten allowing individuals to demand that search results be stripped of their personal data is indeed in line with the general U.S. opinion on the subject.
Conclusion
The opinion only addressed the issue of the right to be forgotten by a search engine, which is considered an intermediary by the Directive, not by the original publisher. By using 'exclusion codes' directing search engines not to index or display a particular web page in their results, an online publisher can effectively minimize the effect of publishing information online, or choose to take the page down entirely if requested to do so by the data subject.
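For readers unfamiliar with the mechanics, these 'exclusion codes' are the long-standing robots.txt and robots meta-tag conventions, which reputable search engine crawlers honor voluntarily. As a minimal illustration (the file path below is hypothetical), a newspaper wishing to keep a 1998 auction notice out of search results could publish a robots.txt file containing:

User-agent: *
Disallow: /notices/1998-auction.html

or add <meta name="robots" content="noindex, noarchive"> to the page itself; the noarchive directive additionally asks engines not to serve a cached copy of the page, the very caching behavior the AG discusses.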
This 'right to be forgotten' is hotly debated right now on both sides of the Atlantic, after the EU Commission published, in January 2012, a Proposal for a data protection Regulation that would be directly applicable in all the Member States. Under article 17 of the Proposal, the data subject would have the right to have the data controller delete personal data relating to her and to stop further dissemination of such data. It remains to be seen when and if this article will become law in the EU. However, if there is no right to be forgotten, everything we do, both online and off-line, may remain recorded on the Web in perpetuity.
Marie-Andrée Weiss is a solo attorney admitted in New York, and her admission is pending in France. Her practice focuses on intellectual property, privacy, and social media law. She frequently writes on these topics and on European Union law.
(Image courtesy of Flickr user fake is the new real pursuant to a Creative Commons CC BY-NC-SA 2.0 license.)
Now, as part of the settlement, the school district has agreed to treat the child as a boy. Thus does an entire institution find itself compelled to accept the cultural left’s moral categories and priorities. This is why the Times labels transgender “the next civil rights frontier.” There’s always one, isn’t there?
This is from Rod Dreher's post at The American Conservative about "Progressivism's Next Battle."
But what interests me is his comment, “There’s always one, isn’t there?” You can practically hear the sigh.
Well, yes, Rod, there is always one. Progressives are progressive because we believe in progress, and we believe in progress because — generalizing, of course — we believe three basic things.
First, human understanding is conditioned by history, culture, language. We are products of our times.
Second, our understanding tends towards some serious errors. For example, we tend to prefer the company of — and to trust — people who are like us. Worse, we go seriously wrong in judging the relevant ways people are like us, giving far too much weight to differences that make no real difference.
Third, we humans are capable of learning. When it comes to policies and institutions, the great lesson that we keep learning and need to keep learning is that few of the differences actually matter. Put positively, we need to keep learning that people are actually more like us than we thought. The great progressive impulse is to find more and more common humanity, and to adjust our policies around that truth. (And, as an aside that I both believe and hope Rod Dreher will find annoying: Nope, it doesn't end with humans. We need to stop torturing and killing animals because we like the way they taste.)
So, yes, there always is a next frontier. But it’s not because progressives are sneaky land grabbers who are never satisfied. It’s because we are committed to the endless process of discovering our common humanity, and thus becoming fully human.
I’m ok with that.
There was a time when people set out to study what the second screen of TV would be. Fancy remote controls, through which we would interact with digital TV, a TV different from the ones we had before, with applications, the internet inside the TV? That didn't pan out, as we know. Would it be smartphones and tablets, used together with the TV, for actions tied to what is happening on the TV?... That one showed some promise. But what is becoming ever clearer is that the TV is the second screen. Yes, the TV itself, queen of the living room for decades, is on its way to the periphery of everyone's vision and attention.

IBOPE's connectMedia study shows that 30% of people have the TV on while they are on the internet. Among younger people it is probably well over 50%, a rate already seen years ago in an earlier study. Since you cannot merely "watch" the web (in the sense of it being on while you sit at its periphery), you use the web; you have to get involved with it, type, like, chat, act... while watching TV. Slowly but surely, TV is moving to the periphery of your attention, because your crowd (you, your friends, your acquaintances, and whatever all of you point to, recommend, and like) constitutes your real interests, in the foreground.

As long as the scenario was TV worrying about the second screen and figuring out how to make use of it, things were not so bad for mass media, the ecosystem in which communication, with centralized editing and broadcasting, is the foundation for disseminating information. But when that ecosystem shifts toward connectivity for interaction, when people rediscover their interest in one another, without any mediation at all, the medium stops being the message and the TV screen moves to the background once and for all.

Big changes in sight. Soon. And Brazil does not even have real broadband yet. Imagine when the 56% of the population with internet access grows to around 80%, and half of those people have at least an effective 10 megabits per second, wherever they connect from. The battle for TV audiences will then be a battle to be the audience champion of the second screen. Nothing will be as it was. Value chains and markets established over decades will change, and change a lot. That is why Publicis and Omnicom merged, but that is another part of the same story, and one day we will talk about it...
Online dating is a bad idea for teens—especially young teens.
That's why it wasn't particularly responsible of Seventeen Magazine to publish a blog post in which "dating blogger" Isabelle Furth floated the idea of using sites like Match.com to find dates. To be fair, she had concerns about the idea, and she's in college, so theoretically old enough to make these decisions. But college kids don't read Seventeen. Middle school students do. And middle school students are remarkably impressionable.
However, if our only response to this blog is outrage (like the comment that Seventeen gave cyber-stalkers a gift-wrapped present), we miss the point—and some important opportunities.
The reality of the world our children are growing up in is that they are going to meet people online. Don’t get me wrong; teens don’t belong on online dating sites. As they enter the world of dating, it should be with people they know in a real world context, not a cyber-world context. They—and their parents—should know more about their dates than what you can find out from the Internet.
But online dating sites aren't the only place that people—and youth—meet online. They meet on all sorts of social media sites and platforms. As all of us, our children included, start communicating more and more on social media, we run into strangers. Most of those strangers aren't dangerous. Some of those strangers become friends.
I’ve met some wonderful people on social media, people who have taught me and supported me and made me laugh, people who have helped me be a better doctor, parent and person. Granted, I’m a grownup and have a bit more judgment than a teen when it comes to trusting people online. But our children will be grownups one day, and if they don’t have the skills they need to navigate the world of online relationships, they will run into trouble. Manti Te’o’s 2-year love affair with a nonexistent person is a great example.
But even before they are grownups, social media offers youth the opportunity to connect with, and learn from, people all over the world. These connections can make the world smaller, help to build bridges and tolerance, and prepare our youth for the connected life of the future. Also, for youth who suffer from chronic disease, disabilities or who feel marginalized for other reasons, the Internet offers so many opportunities to learn and find support from people facing the same challenges. For so many people, youth included, the Internet can be a real lifeline.
So…rather than just saying, “Don’t do that!” I think parents need to do some real talking—and teaching.
Safety has to be first and foremost. Youth are naturally trusting, especially when someone is nice to them—and we all know how nice predators can act online. Parents need to help their teens understand that all is not necessarily as it seems; they need to be extremely careful with what they share online. They shouldn’t tell strangers where they live or go to school, for example. Telling secrets or saying bad things about people can work out badly too, if it turns out the new online friend can’t be trusted. And they must never, ever go to an in-person meeting with someone they met online unless an adult is present.
But really, very little about navigating online relationships is black and white. Each person and circumstance is a bit different. There are ways to gather data about strangers that can help you figure out if they can be trusted—but none of those ways are foolproof. There are also ways to have relationships online without putting yourself at risk—but those ways will vary depending on the situation. That’s why parents need to have ongoing conversations with their teens about what they are doing and who they are meeting online.
There’s no way a teen is going to have those conversations if all they hear from you is doom and gloom. They will figure you don’t understand. They will make friends online, and they won’t tell you about it.
So talk to your teens about the Seventeen blog, especially if they read it. See what they think, and talk with them about why online dating is a bad idea for them. But instead of having that be the end of the conversation, make it the beginning.
Dr. Claire recently spoke with New England Cable News on the subject of teens and online dating; watch her interview here:
Speaking to members of the House Democratic caucus on Capitol Hill, Mr. Obama said in answer to a sharp question from Representative Ed Perlmutter of Colorado that he believed Mr. Summers had been maligned in the liberal news media, according to several House Democrats who attended the meeting.
Representative Gerald E. Connolly, Democrat of Virginia, said the president described Mr. Summers as a rock of stability who deserved credit for helping to steer the American economy back from the financial crisis of 2008 and the ensuing recession. Mr. Obama, Mr. Connolly said, singled out the negative coverage of Mr. Summers in The Huffington Post.
During Wednesday’s meeting, one Democratic lawmaker, who requested anonymity, said the president became agitated and rose to Summers’ defense in response to Rep. Ed Perlmutter (D-Colo.) walking up to the microphone and simply saying, "Larry Summers. Bad Choice."
In paraphrasing Obama's response, the lawmaker said the president replied, "Hey, don't talk sh*t about him because he's actually a pretty good guy. And then he said, 'If somebody talked sh*t about you like that, I'd defend you too."' (The lawmaker added that Obama didn't use the expletive.) …
After Wednesday's meeting, Rep. Gerry Connolly (D-Va.) told reporters that Obama said he felt Summers has been treated unfairly.
"He gave a full-throated defense of Larry Summers and his record in helping to save the economy from the dark days of '09,” he said. "I mean, Larry Summers is a very capable person. I think the president showed real moxy in rising to his defense."
by Harry Lewis (noreply@blogger.com) at August 01, 2013 02:30 AM

Earlier this year, the Third Coast International Audio Festival ordered up an auditory feast with their annual ShortDoc contest. The challenge was: make a two- to three-minute radio story on the idea of “appetite,” serve it up in three “courses” (i.e. chapters), and title it with one of the five tastes. (Yes, umami counts.)
Team Third Coast sorted through nearly 250 submissions and hand-picked eight superior ShortDocs. And they’re turning it over to We, The People to pick the best one for a “People’s Choice” award.
Check out the menu here, and then vote for your favorite here!
These ShortDocs will not be on the table for long. Voting closes today, July 31 — BUT, word on the street is that voting will stay open until 11:59pm Hawaii time, or 5:59am on August 1 on the east coast. Get your fill of ShortDocs while they last!
If we want the best among our Guardians, we must take those naturally fitted to watch over a commonwealth. They must have the right sort of intelligence and ability; and also they must look upon the commonwealth as their special concern--the sort of concern that is felt for something so closely bound up with oneself that its interest and fortune, for good or ill, are held to be identical with one's own.

The great and admirable leaders, of universities and the nation, all think that way, don't they?
by Harry Lewis (noreply@blogger.com) at July 31, 2013 03:24 PM
This Friday, at the very #rare time of 5pm and at a rather lovely spot, I’m psyched to be opening for two of my favorite (erstwhile) local talents: Rizzla DJ & False Witness, the two from the #KUNQ crew who cooked up the time-warped, globally-warmed, zombie beach party of Isla Toxico –



While we won't exactly be performing on a toxic island (though you might consider Boston such at times), we will be right on the water, at the Institute of Contemporary Art's waterfront space — definitely one of the nicer sunsetting spots in the city. They're calling the event Urban Beach, which, yeah, but I think that's a theme we can all work with.
From 5-6:30pm I’ll be doing my best to level the vibes. Playing before dark can be as liberating as it is constraining, so I’m looking forward to the chance to play things that diverge from club imperatives. Haven’t had a chance to play a sun-drenched set in a while. Or slow music, for that matter.
Back to my cohorts, though: I couldn’t be happier to play opening act for these two. It’s been a pleasure to watch Rizzla’s distinctive productions and insurgent sets get the uptake they deserve. He & the whole #KUNQ crew bring together such a great set of shared & individual sensibilities, and the results manage to challenge as they seduce. Take, as another example, the very latest c/o Rizzla & Blk.Adonis –
Crossfading and fusing a special & specific array of styles, their shifting constellation of soca, hardstyle, ballroom, dancehall, and reggaeton transmits a finely-tuned address to a particular (if cross-sectional & always emergent) public — a musical beacon which worked wonders here in Boston back when Rizzla & co. were all resident here & giving this old town some NU LIFE.
That a progressive / queer / genderqueer / feminist / anti-racist / inclusive movement would rally around brusque dancehall anthems and raved-up dembow speaks volumes. Eschewing the imperial work that appropriation does, the #KUNQ approach gestures toward the more complex possibilities that emerge when we embrace difference. “Get to know it,” Rizzla says, “and you won’t want to rip it off.”
But he may have said it even better when he said –
Fortunately for us toxic islanders of Greater Boston, NYC is not too far away, so we’re still graced with the #KUNQ crew’s presence on a relatively regular basis. Hearing some #KUNQ beats echo across the harbor this Friday sure sounds like a vibes to me. Maybe you too?
Job Description
PRX Remix Assistant Producer
THE OPPORTUNITY
The PRX Remix Assistant Producer will seek, find, screen, and recommend great story-driven audio for use on PRX Remix. If you are a critical and even somewhat obsessive listener to public radio, podcasts, and intriguing sound of various kinds, and you can pick out the most compelling audio from the mediocre and mundane, your curatorial ears might be just the right fit for PRX Remix.

Working under the direction of PRX Remix Program Director Roman Mars, and coordinating with PRX editorial staff, you will also get your hands dirty writing, producing, recording, editing, voicing and interviewing to create short-form interstitial content for use on PRX Remix. You will be in frequent touch with producers and podcasters about using their work, and communicate closely with the PRX team.
Strong candidates should have:
We strongly prefer that the producer work out of our Cambridge, Mass. office. However, if you are interested and do not live here, you may still apply. This is a part-time opportunity with some flexibility in scheduling. Exact hours and days per week will be dependent on the chosen applicant.
To apply, upload your cover letter and resume here. Please email jobs [at] prx [dot] org with any questions.
About PRX Remix
PRX Remix is for people who love to listen to great stories. We handpick the best short works from shows like The Moth, 99% Invisible, and Snap Judgment, from independent radio makers on PRX.org, and from podcasters everywhere. Then we mix it up in a never-ending stream. PRX Remix can be heard on XM 123, mobile apps, and stations around the country. Learn more.
The Obama administration's war on leaks and, by extension, the work of investigative reporters, has been unrelenting
The American journalism trade is breathing a collective – but premature and, in many cases, grossly hypocritical – sigh of relief today. A military judge has found Bradley Manning guilty of many crimes, but "aiding the enemy" isn't one of them.
Had the judge found Manning guilty of aiding the enemy, she would have set a terrible precedent. For the first time, an American court – albeit a military court – would have said it was a potentially capital crime simply to give information to a news organization, because in the internet era an enemy would ultimately have been able to read what was leaked.
However, if journalism dodged one figurative bullet, it faces many more in this era. The ever-more-essential field of national security journalism was already endangered. It remains so. The Obama administration's war on leaks and, by extension, the work of investigative reporters who dare to challenge the most secretive government in our lifetimes, has been unrelenting.
The Manning verdict had plenty of bad news for the press. By finding Manning guilty of five counts of espionage, the judge endorsed the government's other radical theories, and left the journalism organization that initially passed along the leaks to the public, Wikileaks, no less vulnerable than it had been before the case started. Anyone who thinks Julian Assange isn't still a target of the US Government hasn't been paying attention; if the US can pry him loose from Ecuador's embassy in London and extradite him, you can be certain that he'll face charges, too, and the Manning verdict will be vital to that case.
The military tried its best to make life difficult for journalists covering the Manning trial, but activists – not traditional journalists – were the ones who fought restrictions most successfully. Transcripts weren't provided by the government, for example. Only when the Freedom of the Press Foundation crowd-sourced a court stenographer did the public get a record, however flawed, of what was happening.
That public included most of the press, sad to say. Only a few American news organizations (one is the Guardian's US edition) bothered to staff the Manning trial in any serious way. Independent journalists did most of the work, and did it as well as it could be done under the circumstances.
The overwhelmingly torpid coverage of this trial by traditional media has been yet another scandal for the legacy press, which still can't seem to wrap its collective brain around the importance of the case, and especially its wider context. National security journalist Jeremy Scahill summed it up after the verdict when he told Democracy Now: "We're in a moment when journalism is being criminalized."
For those who want to tell the public what the government is doing with our money and in our name, there are new imperatives. Governmental secrecy, surveillance and the systematic silencing of whistleblowers require updated methods for journalists and journalism organizations of all kinds. Americans pursuing this craft have to understand the risks and find countermeasures.
That is not enough. The public needs to awaken to the threat to its own freedoms from the Obama crackdown on leaks and, by extension, journalism and free speech itself. We are, more and more, a society where unaccountable people can commit unspeakable acts with impunity. They are creating a surveillance state that makes not just dissent, but knowledge itself, more and more dangerous. What we know about this is entirely due to leakers and their outlets. Ignorance is only bliss for the unaccountable.
In my reading, MIT does not come off as cleanly in Hal Abelson’s excellent report as Pres. Reif’s spin suggests.
When Pres. Reif writes that MIT’s actions were “reasonable, appropriate and made in good faith” I think we have to ask “Appropriate to what?” To MIT’s interests as a legal entity? Very likely. To MIT as a university? Not in my book. I won’t try to adjudicate the claims that MIT cooperated eagerly with the prosecutors but dragged its feet with the defense; I’m too emotionally involved to trust my reading of the evidence in the Abelson report. But, MIT’s timid “neutrality” wasted an opportunity to stand against the unreasonable and inappropriate tactics of the prosecutors, and to stand for the spirit of inquiry, openness, innovation, and risk-taking that has made MIT one of the world’s great universities.
I understand that MIT wasn't going to say that it was fine with Aaron's breaching its contract with JSTOR. But MIT could have stood against prosecutorial overreach, and for the values—if not the exact actions—Aaron embodied.
Larry Lessig has posted incisive comments about MIT’s neutrality.
This week's picks for stations: New STEM Story Project pieces, summer music, and young people around the world.
Want to get weekly station newsletters via email? Subscribe.
(This is the second part of a two-part post. In Part One, Bryce Newell examined the implications of government collection and analysis of metadata relating to electronic communications. Today, Bryce picks up from where he left off, considering the implications of government surveillance under different conceptions of freedom.)
The International Covenant on Civil and Political Rights (ICCPR), a widely ratified international human rights treaty, includes provisions that relate to liberal and republican conceptions of freedom that are relevant to current discussions about mass government surveillance and communications intelligence gathering. Article 17 of the ICCPR states that, "No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence, nor to unlawful attacks on his honour and reputation." Article 18 guarantees the freedoms of thought, conscience, and religion, and Article 19 guarantees the "right to hold opinions without interference" and the "right to freedom of expression." The European Convention on Human Rights, an important regional – rather than truly international – treaty, also provides similar protections, as do the constitutions and charters of many other democratic countries. The relevance of these treaties and philosophical accounts of freedom is tied directly to all three of the questions posed in my first post in this series but, I think, they are most interesting when applied to the third question: what transparency and oversight mechanisms ought to govern the collection of communications information by governmental intelligence agencies?
As a starting point, we should seek to identify the oversight mechanisms that are currently in place. In the United States, much of this recently discussed surveillance takes place under the auspices of the Foreign Intelligence Surveillance Act (FISA), as amended by the USA PATRIOT Act of 2001 and the FISA Amendments Act of 2008 (among others). In Canada, the Communications Security Establishment (CSEC) relies on the interconnected provisions of a number of laws, such as the National Defence Act (NDA), the Criminal Code, the Privacy Act, and the Canadian Charter, as well as a number of classified Ministerial Directives (MDs) and Ministerial Authorizations (MAs) that authorize specific activities, including the incidental collection of information about domestic persons. In the United Kingdom, the activities of the Government Communications Headquarters (GCHQ) are regulated by the Regulation of Investigatory Powers Act. All of these agencies are restricted in various ways from targeting domestic persons, and their respective security and human intelligence-oriented agency partners (e.g. the CIA and FBI in the U.S., CSIS in Canada, and MI5 and MI6 in the UK) and law enforcement agencies are also somewhat restricted in their abilities to collect information about domestic persons without appropriate grounds to do so (e.g. some form of warrant, administrative subpoena, or ministerial authorization).
However, as made clear in recent public congressional hearings and court cases in the United States, much of the legal basis for individual surveillance programs (e.g. the classified decisions of the Foreign Intelligence Surveillance Court (FISC)) remains undisclosed. Federal agencies claim multiple layers of oversight by other agencies (the FISC, the Justice Department, and attorneys in multiple offices), and Congress does get private briefings from intelligence chiefs that are not made public. Canadian intelligence agencies are also subject to oversight by ministers and, in the case of CSEC, by an appointed and independent Commissioner. British intelligence is likewise subject to oversight. The problem, from the point of view of ordinary people, is not necessarily with the existence of the surveillance programs themselves; rather, it is largely with the lack of transparency and openness (perhaps combined with a certain level of distrust in the government officials who have authorized these programs for a number of years).
In one recent U.S. case, Clapper v. Amnesty International, the Supreme Court held that a number of human rights and other organizations could not challenge the constitutionality of provisions of FISA that authorize the government to intercept communications between U.S. citizens and foreign nationals (or those suspected of being foreign nationals) and to maintain secrecy about whose correspondence the government has intercepted. The organizations claimed that, because of their regular communications with overseas persons, there was an "objectively reasonable likelihood that their communications will be acquired… at some point in the future" and that the threat of this acquisition had caused them to take costly preventative measures aimed at preserving the confidentiality of their communications. Despite the fact that, due to the law's secrecy requirements, the government is the only entity that knows which communications have been intercepted, the Supreme Court held that third parties like Amnesty International do not have standing to challenge the Act because they cannot show that they have been harmed (precisely because they don't have access to information about the government's surveillance activities).
The government has defended its secrecy in a number of prior and ongoing cases (see here, here, here, here, and here for a few examples). And while some secrecy is undoubtedly necessary (as even the ACLU admits) to properly enable these agencies to prevent attacks on domestic and international soil, the current discussion about the extent of that secrecy is important and sorely needed. Currently, we don't know much about what electronic surveillance activities governments are engaged in, what types of information they are collecting as part of these activities, what uses this information is being put to, or how the FISA Court has interpreted the legal basis for each of the programs. The recent disclosures, and others in the past, have shed some light on these issues, but they are far from comprehensive.
In cases challenging secret surveillance in Europe, the European Court of Human Rights (ECtHR) has stated, in terms that resonate with both the civic-republican notion of freedom as the lack of susceptibility to arbitrary domination and the more liberal idea of freedom as noninterference, that:
[T]he mere existence of legislation which allows a system for the secret monitoring of communications entails a threat of surveillance for all those to whom the legislation may be applied. This threat necessarily strikes at freedom of communication between users of the telecommunications services and thereby amounts in itself to an interference with the exercise of the applicants’ rights… irrespective of any measures actually taken against them. [Case of Liberty and Others v. The United Kingdom, ¶ 56]
According to Neil Richards, an American law professor,
The bottom line about surveillance and persuasion is that surveillance gives the watcher information about the watched. That information gives the watcher increased power over the watched that can be used to persuade, influence, or otherwise control them, even if they do not know they are being watched or persuaded. [link to PDF at Harvard Law Review]
These ideas, in concert with the requirement in the ICCPR that “No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence,” demonstrate the need for substantive discussion about government surveillance and effective oversight by governmental bodies, elected officials, news media, and the public themselves. This may require the declassification of certain documents, including legal interpretations of laws authorizing surveillance activities, and a greater level of transparency (such as proposed by the Ending Secret Law Act). Either way, secretive government surveillance may actually interfere with our rights to intellectual or informational privacy – the rights to access, acquire, and use information, and to control information about ourselves. It might also infringe on our rights to free expression and chill speech, including both speech itself and searching for information as a predicate to speaking. If, however, we cannot access knowledge about whether we have, in fact, been interfered with (including high-level information about general government practices), then approaching the question from a position informed by civic-republican concerns about limiting arbitrary domination becomes highly relevant and important. As presented above, “the mere existence of legislation which allows a system for the secret monitoring of communications entails a threat of surveillance” and, if that power is exercised in an arbitrary fashion, has serious implications for our freedoms, regardless of whether we have actually been sucked into the NSA data vacuum.
On the other hand, increased transparency may also have some negative impact on the efficiency and effectiveness of counter-terrorism efforts. These interests are not insubstantial. As a result, transparency measures should be tailored to respect legitimate national security interests. Ultimately, however, some additional transparency – and some reduction in potential domination – might be justified precisely because it increases the political freedoms of the people. We might claim greater democratic interest in information about secret legal interpretations or methods used to conduct surveillance than about the substantive information actually collected – but we might draw lines in different places as well. Actual interference needs to be adequately justified, and the potential for arbitrary domination by governments must be severely limited. How best to address these concerns is a pressing question. Regardless of where we ultimately end up, answering and debating these questions will have important ramifications for how we think about our political freedom – and how much freedom we ought to let slip away for the sake of security.
Bryce Clayton Newell is currently a Google Policy Fellow at the Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic (CIPPIC) at the University of Ottawa Faculty of Law in Ottawa, Ontario. He is also a doctoral student in Information Science at the University of Washington in Seattle, Washington, a licensed attorney (California, inactive) and a documentary filmmaker. The opinions expressed in this post are those of the author alone, and not of any of the organizations to which he is affiliated. The author thanks CIPPIC for allowing him to pursue this line of research as part of his related work at the Clinic. Bryce can be reached at bcnewell@uw.edu.
(Image: NSA slide leaked by Edward Snowden describing the PRISM surveillance program)
In The Mobile Customer as Data vs. Customer Data, Chuck Martin in MediaPost‘s Mobile Shop Talk says this:
The world of data tracking for mobile commerce is getting much more precise.
The phone knows where the phone goes, as we all know. And that knowledge can be used to help provide better services to those carrying them.
Any driver using Google Navigation, for example, gets the benefit of other phones being tracked to identify bottlenecks on roads ahead. The next step was for Navigation to automatically re-route your trip to avoid the traffic jam, so the benefit became seamless.
The tracking of phones at retail also is being used in efforts to provide a better shopping experience.
In these cases, the value comes from the data about the phone being tracked, not information about the person.
This is about the use of customers as data rather than data about the customer.
This data about phone movements already is being used at hundreds of stores ranging from small mom-and-pop shops to national chains and shopping centers.
He goes on to talk about Euclid, "a three-year-old California company that likens what it does to Google Analytics but for the physical world." And he explains what they do:
Rather than tracking phones by apps, sign-ins, GPS or cell tower, Euclid installs sensors at stores to capture MAC addresses, which are part of every smartphone.
The company doesn’t capture any information about the person, just the identification of smartphones that are on with Wi-Fi enabled.
The idea is to map shopper traffic and analyze how stores can become more effective. The large volume of aggregated data of phone traffic patterns is what provides the value.
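To make concrete how little this kind of sensing requires, here is a minimal sketch of the general technique Chuck describes — not Euclid's actual code, which is proprietary. It assumes a Linux box with a Wi-Fi card in monitor mode (the interface name "mon0" and the salt value are hypothetical) plus the open-source scapy packet library. Phones with Wi-Fi turned on periodically broadcast "probe request" frames carrying their MAC addresses, and that is all a sensor needs:

import hashlib
from collections import Counter

# scapy is a common Python packet-capture library; Dot11ProbeReq frames
# are the periodic "is my known network nearby?" broadcasts that phones
# send whenever Wi-Fi is enabled.
from scapy.all import sniff
from scapy.layers.dot11 import Dot11ProbeReq

SALT = b"per-store-secret"  # hypothetical salt, stable per deployment
seen = Counter()            # hashed MAC -> number of probe requests observed

def handle(pkt):
    if pkt.haslayer(Dot11ProbeReq):
        mac = pkt.addr2  # the transmitting phone's MAC address
        # Hashing strips the literal address, but the pseudonym is stable,
        # so the same phone is recognized on every return visit.
        seen[hashlib.sha256(SALT + mac.encode()).hexdigest()[:12]] += 1

# Listen for one minute on a monitor-mode interface ("mon0" is hypothetical).
sniff(iface="mon0", prn=handle, timeout=60)
print(len(seen), "distinct (pseudonymous) devices observed")

Note the gap between "we capture no information about the person" and what the hashing step actually does: each phone still gets a persistent pseudonym across visits. That gap is what my comment below is about.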
Here is what I put in the comments below (with paragraph breaks and links added):
I am a customer. I am not data. I do not wish to yield personal data, even if anonymized, to anybody other than those with whom I have a fully consenting, non-coercive and respectful relationship.
I do not wish to receive offers as a matter of course, even if machines following me guess those offers might be relevant — especially since what I am doing most of the time is not shopping.
I also don’t wish to have a “better experience” with advertising inundation, especially if the “experience” is “delivered” to me rather than something I have for myself.
Familiar with Trader Joe's? People love them. Know why? They do none of this tracking jive. They just talk, as human beings, to customers. There's no way to automate that, and they save the overhead of marketing automation as well.
Now think of the “mobile experience” we call driving a car, or riding a bike. Our phones need to be the same: fully ours. Not tracking devices.
I know mine is a voice in the wilderness here, but I’m not alone. It’s not for no reason that the most popular browser add-ons are ad and tracking blockers. That’s the market talking. Marketers need to listen.
In a commencement speech this past May, former presidential speechwriter @JonLovett says this (around 14:30): "I believe we may have reached peak bullshit."
He continues: "I believe those who push back against the noise and the nonsense, those who refuse to accept the untruths of politics and commerce and entertainment and government, will be rewarded. And that we are at the beginning of something important." He also pushes back on what he calls "a process that is inauthentic." (Here's a transcript.)
Here's what's real: For whatever reasons, we blew it by not building browsers to be cars and bikes in the first place. Same with smartphones and tablets. We gave wonderful powers to users, but greater powers to companies that would rather track us than respect us, who would rather "deliver" us the "experience" they want us to have than equip us to operate as fully human beings in the world — beings with independence and agency, able to engage in our own ways, and on our own terms.
So, what we’ve got now, nice as it is in many ways, is a feudal system. Not real freedom.
It’s a feudal system run by advertising money, and it is worse than broken: it looks to its masters like it isn’t working well enough. Those masters include lots of good people trying to do the Right Things. But they aren’t listening, because they are too busy talking to each other. The whole marketing ecosystem is an echo chamber now. And we, the users and customers of the world, are not in it, except as magnets for tracking beacons and MAC addresses sold to marketing mills.
There is now a line in the sand. On one side is industrial control of human beings, and systems that “allow” degrees of freedom. On the other side is freedom itself. On that side also lies the truly free marketplace.
Here’s a bet. A lot more money will be made equipping individual human beings with means for enjoying full agency than there is today in “delivering” better sales “experiences” to them through browsers and phones that aren’t really theirs at all.
And here’s betting we’ll get better social effects too: ones that arise from freedom of association in an open world, rather than inside giant mills built for selling us to advertisers.
United Kingdom
News reports and online discussions on freedom of expression have been dominated this week by Prime Minister David Cameron’s proposals to require ISP-level anti-pornography filters. Cameron’s motivations for the proposal have been questioned, especially after ISPs disclosed that the filter settings include blocks for many other kinds of online content such as social networking, gambling, file sharing, or sites concerned with drugs, alcohol and tobacco. The UK government’s reliance on the Chinese telecom firm Huawei to maintain the list of blocked sites and the decision to turn the filter on by default, requiring users to opt out of filtered access, has prompted strong responses from freedom of expression and privacy advocates. Adding to the controversy, hackers posted pornographic images on the website of Claire Perry, one of the architects of the ISP-level filters. Perry’s response generated more controversy when she accused the blogger who reported the hack of being responsible for the content; critics argue her responses demonstrate a poor understanding of digital technologies.
Russia
It’s been a controversial week for the Russian Internet. The country’s recent waves of violence against members of the LGBTQ community have been facilitated by social networks, which vigilantes use to identify and physically locate victims, and by the ability to share bullying videos online. The U.S. has also identified several young Russians behind top U.S. cyber thefts in the last seven years, leading to arrests and extraditions. Finally, the head of the Russian State Duma’s Committee for Family, Women, and Children has proposed modifications to Russia’s existing content rules to block bad language from social networks, websites, and forums. Earlier this year, Russia banned swearing from its media outlets and prohibited companies from making products featuring swear words. Also, Ilya Segalovich, co-founder of Russia’s largest search engine, Yandex, died today.
Australia
Shortly after the UK announced it would be requiring ISPs to filter adult content, the Australian Christian Lobby announced it would be renewing its campaigns to block porn in Australia. In 2008 Australia attempted to pass similar porn-blocking legislation, but lack of popular support killed the proposed plan when the Coalition government refused to vote on the matter. At the same time, Australia’s Parliamentary Inquiry into the higher prices charged by IT companies selling hardware, software, and digital downloads in Australia recommended that the Australian government educate consumers in circumventing the geolocation tools used by IT companies to determine where buyers are located. The Inquiry also required testimony from representatives of Apple, Adobe, and Microsoft as to the reasons for the higher prices, but found these companies could not satisfactorily explain why their products cost more when sold to people in Australia.
United States
This week, an anonymous web developer claimed that the U.S. government is requiring companies to turn over encryption keys. The U.S. government has so far denied the claims and some companies, like Microsoft and Google, have declined to say whether the government has made any such requests, but indicate they will not comply if asked for server-to-server email encryption keys. Also, an Internet monitoring company released a study which found that Google is responsible for 25% of all Internet traffic in North America, which is more than Facebook, Netflix, and Instagram combined. This is up from 6% of Internet traffic in 2010. Finally, a Texas man was charged this week with creating and operating a Bitcoin Ponzi scheme worth approximately $65 million at today’s exchange rate. The scam involved using money from new investors to make “interest” payments to earlier ones and to cover withdrawals.
(Following on from Rebekah Bradway's post last week regarding government-created metadata as public records, we are pleased to present a two-part post from Bryce Newell on the role of metadata in government surveillance. -- Ed.)
As much of the world is now undoubtedly aware, the National Security Agency (NSA), and many other signals intelligence agencies around the world, have been conducting sophisticated electronic surveillance for quite some time. Many might have expected that such extensive surveillance was occurring, both domestically and globally, prior to Edward Snowden’s release of classified information in June 2013. Indeed, we’ve known about the existence of government driven metadata surveillance and international intelligence cooperation and data-sharing for years. The UKUSA Agreement, which links intelligence agencies in the United States, United Kingdom, Canada, Australia and New Zealand, was declassified by the NSA in 2011, but its existence was reported much earlier.
What we haven’t known, perhaps, are some of the specifics (e.g., here and here) brought to light by the recent revelations – or much about the legal analysis and oversight to which such surveillance activities are subjected in practice. The fallout from Snowden’s disclosures has not been limited to the U.S. either. News media in both Canada and the U.K. have released documents indicating that agencies in these countries are also conducting similar programs.
Much of this surveillance appears limited to the metadata – information about information – associated with telephone calls, emails, and other forms of electronic communications. Officials are claiming that metadata is less revealing than the actual contents of our communications – although who hasn’t sent or received an email that included all of its content in the subject line? However, as our landline-initiated telephone calls of the past have been largely supplanted by cellular phone, wireless, and Internet-based communication, the amount of metadata – and its ability to ascribe revealing attributes about us – has grown tremendously. Correspondingly, much more can be done with this data to reveal personal information.
Prior to mass adoption of and access to the Internet, our electronic communications metadata consisted of logs indicating which numbers we called from, what numbers we called, at what time we made these calls, and how long the calls lasted. Much of this information is what we expect to see on a phone bill from our phone company. Now, however, our communications metadata often includes (among other things) highly accurate geo-location information (including movement information) sourced from cell towers, GPS chips embedded in our devices, and the presence of available WiFi connections nearby. This information is not just useful to determine where we are or where we happened to be when we made or received a call, but may also indicate who we are traveling with (based on geo-location metadata of those nearby). As researchers have shown, small amounts of otherwise anonymous geo-location data from cellular phone networks can be used to accurately identify individuals with high levels of confidence.
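As an illustration of that shift (every field and value below is invented), compare a phone-bill-era call record with what a smartphone can attach to the same call:

```python
# Illustrative only: the gap between a phone-bill-era call record and the
# richer metadata a smartphone can emit today. All field names are invented.
legacy_call = {
    "caller": "617-555-0100", "callee": "206-555-0199",
    "start": "2013-07-28T14:03:00Z", "duration_s": 312,
}

modern_call = dict(legacy_call,
    cell_tower="us-ma-cambridge-0042",          # coarse location
    gps=(42.3736, -71.1097),                    # fine location from the device
    wifi_ssids=["CafeGuest", "LibraryPublic"],  # nearby networks
    device_id="IMEI-0000000000",                # stable hardware identifier
)

# Join the fine-grained locations across many such records and you get a
# movement trace; from co-located traces, likely companions.
print(sorted(set(modern_call) - set(legacy_call)))
```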
This is not to say that eye-opening things could not be done with very simple sets of metadata about us in the past, as shown by this fascinating post by Professor Kieran Healy about using basic social network analysis to identify networks of people suspected of anti-government activities (in that case, Paul Revere and the rebellious colonists in America). Imagine what sophisticated statistical and social network analysis can do when we add large amounts of additional information and dramatically increase the number of subjects under study and the resources available to study them. The David Petraeus scandal has also shown us that metadata, like IP addresses indicating the locations where Petraeus and Paula Broadwell logged into their anonymous shared email account, can be enormously helpful in identifying people and telling stories about their lives.
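Healy's exercise is easy to reproduce in miniature. Here is a toy sketch, with invented data, of its core step: projecting a person-by-group membership table into person-to-person ties, whose weights flag likely associates.

```python
# A toy version of the analysis in Healy's post: co-membership counts from a
# person-by-group table. High counts suggest likely associates. Data invented.
memberships = {
    "Revere": {"NorthCaucus", "LondonEnemies", "TeaParty"},
    "Warren": {"NorthCaucus", "LondonEnemies"},
    "Church": {"NorthCaucus"},
}

people = sorted(memberships)
for i, a in enumerate(people):
    for b in people[i + 1:]:
        shared = memberships[a] & memberships[b]
        if shared:
            print(f"{a} -- {b}: {len(shared)} shared groups {sorted(shared)}")
```

Scale the same arithmetic up to millions of subscribers and years of records, and the "just metadata" framing starts to look rather thin.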
Not only are we confronted with difficult questions about how we ought to define reasonable expectations of privacy in relation to metadata or regulate government surveillance for national security purposes (which is admittedly a highly complex and difficult question), but we also live in a world where information knows no borders. Domestic legal protections, often requiring an intelligence agency to demonstrate some level of reasonable suspicion or probable cause prior to collecting some types of personal information on domestic persons, potentially disappear when foreign governments also collect and share similar data. Much has been said about the need for human rights-based protections to regulate transnational intelligence networks and cross-border collaboration, but these suggestions also pose tremendously difficult questions (not that this means the discussion is not very much worth having).
At least three primary questions need to be addressed (of course, each raises a host of ancillary questions as well). First, what types of communications information should we allow governments to acquire without restrictions? Second, what procedural and legal hurdles should be put in place to protect “personal” or “private” information (and e.g. where does metadata fall on the spectrum from public to private)? Third, what transparency and oversight mechanisms ought to be put in place to ensure that governments are abiding by the policies put in place? We should not necessarily be dismantling our signals and communications intelligence infrastructures, but we should be ensuring – and have the power to ensure – that they operate within legal and democratically sanctioned ways that do not impermissibly infringe on our political freedoms (a form of what Philip Pettit has called “antipower”).
Freedom itself is a contested ideal. Isaiah Berlin famously differentiated between negative and positive liberties; negative meaning the absence of interference, positive meaning (on some accounts) that a person is free to the extent they achieve autonomy, self-mastery, or self-control. Negative liberty, or the idea that a person is free to the extent they are not actually interfered with, maintains a preferred position in much contemporary liberal literature. The concept of positive liberty is more controversial, since it may involve determinations about whether a person has the ability to act on their second-order desires – a person may desire not to desire something; their “true” desire may be to overcome a certain trait, addiction, or habit – but they are not free unless they can actually act on these higher desires. On the other hand, a civic or neo-republican account of freedom, such as that offered by Philip Pettit, is primarily concerned with liberty in a negative sense, but equates freedom with nondomination. Domination, on this account, is not limited to instances of actual interference but extends to the possibility that such interference could be realized by the existence of power relationships that allow one person (or state) to arbitrarily interfere with another at will.
In part two, I explore how both of these negative conceptions of freedom – noninterference and nondomination – are implicated by the realities recently disclosed by Edward Snowden, with references to the International Covenant on Civil and Political Rights and case law from the U.S. Supreme Court and European Court of Human Rights.
###
Bryce Clayton Newell is currently a Google Policy Fellow at the Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic (CIPPIC) at the University of Ottawa Faculty of Law in Ottawa, Ontario. He is also a doctoral student in Information Science at the University of Washington in Seattle, Washington, a licensed attorney (California, inactive) and a documentary filmmaker. The opinions expressed in this post are those of the author alone, and not of any of the organizations to which he is affiliated. The author thanks CIPPIC for allowing him to pursue this line of research as part of his related work at the Clinic. Bryce can be reached at bcnewell@uw.edu.
(Photo of NSA Headquarters courtesy of www.nsa.gov, in the public domain pursuant to 17 U.S.C. § 105.)
Old friends Old Money Massive have released the best damn rap album I’ve heard in lightyears.
Obvi, we’ve been fans at W&W since “African Kids” — and I’m happy to have had a little hand in bringing Old Money to Boston a couple times. They’ve been leaking flames in the form of tracks & videos for daze, but I’m beyond thrilled that they finally brought their bracing vision to the world in the shape of a restless but deeply coherent “mixtape” (along with assorted transmedia objects, as I’ll note below).
There’s a lot I could say about the sui generis afropessimystic futurism they’ve encrypted for this zipfile, but just go ahead and listen for yourself, and be sure not to skip the bumboclaat intro –
If you need a little more of a hermeneutical angle, their official bio offers hints —
Ahmad Julian and Andre Oswald are Old Money, a New York based rap, production and DJ duo of Jamaican and Guyanese origins. Their music incorporates the sounds of contemporary Africa such as UK Funky, Dancehall, Kwaito, Kuduro and Hip-Hop while remaining rooted in traditions of pan-African philosophy. In this way, their output remains dynamic and cutting-edge, while also taking on a mystical bent – influenced by fringe spiritual orders like the Nuwaubians, the Moors, NOI, and The 5 Percenters, as well as science fiction novels by author Octavia Butler.
But you can also get the gist from ish like this, the vivid video for “Rumble In Tenochtitlan” –
Very helpful and generous of the duo, their “Certified Space Trade Mix” — with matching Dr. Bronner’s inspired t-shirt! — provides a broader, and at once more specific, sense of the musical and philosophical background underpinning their sound:
Finally, a great interview over at Dazed Digital (including a brief, funny, and much appreciated shoutout to yours truly) offers further angles to consider while you nod along to the beats. Here’s the pulliest of pull quotes, a good glimpse into what shapes Old Money’s aesthetic –
Dazed Digital: You were brought up in the Bronx and Brooklyn. How did growing up in the boroughs of hip hop’s birth influence you?
Ahmad Julian: Tremendously, though I’d say it influenced us more so in the past than it does now, at least musically speaking. Of course, certain things stay with you – a certain awareness, a certain paranoia, how you carry yourself, sartorial choices, vernacular, etc. But at this point I’d say equally important as far as influence goes would be the internet and our travels, which have enabled us to connect dots where we might not have otherwise. All of this, hopefully, comes through in the music.
Fire in the dark, seen. Gwaan catch the spark already. Blackstar Galactica been boarding…
Everyone, really. and you can train plenty of people to achieve what, to the eyes of non-specialists, looks like the work of a great master. that's what dafen, the painters' village near shenzhen, in china, does. the place is a combined world-class center for producing, exhibiting and dealing in works of art. world-class the combination, if you like, and part of the artworks too, if you don't look too hard at the details.
but… look at the image below.
in the young artist's studio there are two van gogh sunflowers. don't be alarmed. across the whole town there are tens of thousands, at whatever hour you happen to pass through. look, there's one on the ground in the next image… in the middle of the street.
granted, even van gogh copied himself, and the paintings in the two photos are not the originals, nor are they such bad copies at that. the only sunflowers sold in recent times went in 1987, for what today would be more than R$180 million [and it looks like the one in the first photo]. the sunflowers “of the second photo” are in the neue pinakothek, in munich, and the history of van gogh's sunflowers is at this link. oh, yes: a basic van gogh sunflowers costs about R$80 in dafen, and one of quality comparable to the museum pieces goes for some 10 times as much. 800 bucks for a van gogh… of museum grade? hotel chains, companies, galleries… order them by the dozen. wal mart, by the tens of thousands. almost 3/4 of the production is exported.
right. and what does that show? it shows that china has managed to solve the problem of copying works of art [great masters included] with some level of fidelity and at wal-mart prices. in dafen, 5,000 artisans turn out around 5 million paintings a year, which comes to 1,000 canvases per painter per year, almost 3 a day. an artist's life in china is no picnic. and it attracts plenty of people, since vocation and training can get you a job and an income far better than assembling iPhones at foxConn, where life is decidedly much harder. below, a copying competition judged on fidelity and execution time, used to select students for advanced training and, when it comes to that, to hire painters.
are you, art lover, scandalized by this scene? then get ready for the next round: soon it will be possible to do the same thing, with the same effect, without any human in the process. at the university of konstanz, researchers are working on a robot capable of doing what dafen's painters do: analyzing in great detail an image to be copied and producing a copy almost as perfect as the original. the system is still in its early versions and is based on a welding arm from an assembly line… but there is a lot of software and a lot of creativity behind it, from the crew designing the system and writing the code.
of course the purpose [yet?] is not to compete with dafen, but to try to understand painting computationally: the thesis is that painting is an optimization process in which paint is distributed over a canvas until the painter can recognize the content [as what he meant to “paint”]. the system is simple and is shown in the image below.
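to make the thesis concrete, here is a toy sketch [everything below is invented and has nothing to do with eDavid's actual code]: painting as greedy optimization, where random strokes are kept only if they bring the canvas closer to the target image.

```python
# painting as optimization, in miniature: greedily add square "brush strokes"
# to a blank canvas, keeping each one only if it reduces the error against
# the target image. all parameters here are arbitrary toy choices.
import numpy as np

rng = np.random.default_rng(0)
target = rng.random((64, 64))           # stand-in for the image to copy
canvas = np.ones_like(target) * 0.5     # blank gray canvas

def error(a, b):
    return np.mean((a - b) ** 2)        # mean squared difference

for _ in range(2000):
    y, x = rng.integers(0, 60, size=2)  # random stroke position
    shade = rng.random()                # random stroke color
    trial = canvas.copy()
    trial[y:y+4, x:x+4] = shade         # one 4x4 "stroke"
    if error(trial, target) < error(canvas, target):
        canvas = trial                  # keep strokes that help

print(f"final mse: {error(canvas, target):.4f}")
```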
and look at what it already does, imitating rembrandt… how long do you think it will take for eDavid [the painting robot's name; see the project here] to paint this canvas like rembrandt, rather than to be a painter like rembrandt? although… well, never mind; the original rembrandt is here.
to finish… watch eDavid “painting” in the video below. imagine the possibilities. then maybe go back to the title and think about including robots in that “everyone”. to reflect further, go through a series on this blog [rewriting humans in software] and ask whether we are going to rewrite everything, even our own functions, in software.
i think we are. and that's what the series is about. better learn to program…
I'm spending this week at the Future of Learning Institute, so it seems like a good time for a long overdue review of Marina Gorbis's The Nature of the Future: Dispatches from the Socialstructed World.
Gorbis is the director of the Institute for the Future in Palo Alto, an organization with connections to a variety of people interested in the nexus of education, technology, and the future, like Howard Rheingold, Jane McGonigal, and John Seely Brown. The Institute for the Future spends its time imagining the ways that social forces and technological change will reshape the world. Its people are optimistic but not utopian; they believe that the world can be made a better place, and that we should try, even knowing, historically, that change always produces winners and losers. They are deliberately provocative and imaginative in assuming that the future will not merely be a few tweaks on the present, and they intentionally coin new words and terms to help create a language for describing things that don't quite yet exist.
One of the terms Gorbis introduces in her new book is "socialstructing," the process of leveraging networked connections to allow individuals or small communities to create societal changes that previously required large institutions or movements to enact. An easy example is Wikipedia: a few people make a framework for sharing knowledge and suddenly information distribution is fundamentally changed.
Interestingly, part of Gorbis's inspiration for this framework comes from her upbringing in the Soviet Union, where her widowed mother raised her in a social economy short on hard currency but long on barter, care, and community. She saw the wealth that people could generate by caring for each other, and she's bullish about the potential of technology to expand these kinds of communities across geographic boundaries and to reduce the transaction costs of bartering, making sharing ever easier and more compelling.
In education, this means that people can share their talents. Part of her vision of the future of learning relies on adaptive technologies to personalize learning experiences, clocks that understand your brain cycles, and optical devices that allow people to create informational overlays on the real world. But the heart of her vision is that technology will enable multigenerational learning communities, allowing for ad hoc learning groups, apprenticeships crossing geographic boundaries, and a personalization of learning that comes more from building relationships between experts and learners than from algorithms serving up learning objects. The video below is an engaging introduction to some of these ideas.
Marina Gorbis - Future of Education from IFTF on Vimeo.
What I've provided above is description rather than commentary. My own take on the future of education is probably more conservative: our social systems of education are extremely resistant to change, especially at the level where students and teachers interact. But the utility of futures thinking is not just about making predictions that are correct. It's about expanding our imagination, giving us new visions of what learning spaces might look like, challenging educators to look outside the sector for inspiration and tools for change.
What I enjoyed most about Gorbis' book is the challenge to envision the big leaps that might take us to a better world. That's a good frame of mind for me to be in as I spend the week working with other educators to imagine the future that we'd like to create.
For regular updates, follow me on Twitter at @bjfr and for my publications, C.V., and online portfolio, visit EdTechResearcher.
Check out the Future of Learning Twitter stream at #HGSEPZFOL.
- Justin Reich
Ziv Lotan Boyd was born into this world shortly after midnight on Sunday, July 28 after a movie-esque labor (complete with a NYC cabbie honking like mad and running red lights to prevent me from delivering in the cab). The little ball of cuteness entered this world at a healthy 7 pounds, 13 ounces. We’re all doing well as we recover.
As per my maternity note, I have no idea what the days ahead may bring but please understand that I may be non-responsive for a while, especially when it comes to work-related requests. If you need my attention for something work related, please wait a while before approaching me. Thanks!
In 1960, the academic journal Technology and Culture devoted its entire Autumn edition [1] to essays about a single work, the fifth and final volume of which had come out in 1958: A History of Technology, edited by Charles Singer, E. J. Holmyard, A. R. Hall, and Trevor I. Williams. Essay after essay implies or outright states something I found quite remarkable: A History of Technology is the first history of technology.
You’d think the essays would have some clever twist explaining why all those other things that claimed to be histories were not, perhaps because they didn’t get the concept of “technology” right in some modern way. But, no, the statements are pretty untwisty. The journal’s editor matter-of-factly claims that the history of technology is a “new discipline.”[2] Robert Woodbury takes the work’s publication as the beginning of the discipline as well, although he thinks it pales next to the foundational work of the history of science [3], a field the journal’s essays generally take as the history of technology’s older sibling, if not its parent. Indeed, fourteen years later, in 1974, Robert Multhauf wrote an article for that same journal, called “Some Observations on the State of the History of Technology,”[4] that suggested the discipline was only then coming into its own. Why, some universities have even recognized that there is such a thing as an historian of technology!
The essay by Lewis Mumford, whom one might have mistaken for a prior historian of technology, marks the volumes as a first history of technology, pans them as a history of technology, and acknowledges prior attempts that border on being histories of technology. [5] His main objection to A History of Technology — and he is far from alone in this among the essays — is that the volumes don’t do the job of synthesizing the events recounted, failing to put them into the history of ideas, culture, and economics that explains both how technology took the turns that it did and what those turns meant for human life. At least, Mumford says, these five volumes do a better job than the works of three nineteenth-century Britons who wrote something like histories of technology: Andrew Ure, Samuel Smiles, and Charles Babbage. (Yes, that Charles Babbage.) (Multhauf points also to Louis Figuier in France, and Franz Reuleaux in Germany.[6])
Mumford comes across as a little miffed in the essay he wrote about A History of Technology, but, then, Mumford often comes across as at least a little miffed. In the 1963 introduction to his 1934 work, Technics and Civilization, Mumford seems to claim the crown for himself, saying that his work was “the first to summarize the technical history of the last thousand years of Western Civilization…” [7]. And, indeed, that book does what he claims is missing from A History of Technology, looking at the non-technical factors that made the technology socially feasible, and at the social effects the technology had. It is a remarkable work of synthesis, driven by a moral fervor that borders on the rhetoric of a prophet. (Mumford sometimes crossed that border; see his 1946 anti-nuke essay, “Gentlemen: You Are Mad!” [8]) Still, in 1960 Mumford treated A History of Technology as a first history of technology not only in the academic journal Technology and Culture, but also in The New Yorker, claiming that until recently the history of technology had been “ignored,” and that “…no matter what the oversights or lapses in this new ‘History of Technology,’ one must be grateful that it has come into existence at all.”[9]
So, there does seem to be a rough consensus that the first history of technology appeared in 1958. That the newness of this field is shocking, at least to me, is a sign of how dominant technology as a concept — as a frame — has become in the past couple of decades.
[1] Technology and Culture. Autumn, 1960. Vol. 1, Issue 4.
[2] Melvin Kranzberg. “Charles Singer and ‘A History of Technology’” Technology and Culture. Autumn, 1960. Vol. 1, Issue 4. pp. 299-302. p. 300.
[3] Robert S. Woodbury. “The Scholarly Future of the History of Technology” Technology and Culture. Autumn, 1960. Vol. 1, Issue 4. pp. 345-8. p. 345.
[4] Robert P. Multhauf. “Some Observations on the State of the History of Technology.” Technology and Culture. Jan., 1974. Vol. 15, No. 1. pp. 1-12.
[5] Lewis Mumford. “Tools and the Man.” Technology and Culture. Autumn, 1960. Vol. 1, Issue 4. pp. 320-334.
[6] Multhauf, p. 3.
[7] Lewis Mumford. Technics and Civilization. (Harcourt Brace, 1934. New edition 1963), p. xi.
[8] Lewis Mumford. “Gentlemen: You Are Mad!” Saturday Review of Literature. March 2, 1946, pp. 5-6.
[9] Lewis Mumford. “From Erewhon to Nowhere.” The New Yorker. Oct. 8, 1960. pp. 180-197.
I’m on vacation for the next two weeks, taking a break from a long stretch of writing and talking about Rewire and related issues. I should be back online around August 10. In the unlikely event you find yourself missing me, here’s the video of a talk and discussion I had about Rewire at Harvard’s Berkman Center last month.
And if you’re in need of more reading material, check out an important new paper from Yochai Benkler, Hal Roberts and other friends at the Berkman Center. The paper uses Media Cloud to analyse the conversation online around SOPA/PIPA and understand agenda-setting, framing and relationships that influenced the debate. My students and I are finishing up a parallel paper at Center for Civic Media using some of the same techniques, and some new techniques, to examine the online debates that helped lead to George Zimmerman’s arrest, which we hope to have out in early fall.
Hope you’re having a great summer.
History repeats itself. or rather, history repeats fiction. the idea of replacing parts of the body with artificial ones is as old as the mastery of tools, and that goes back almost half a million years. eyeglasses have existed for centuries, but only took on their current form and function decades ago; glasses are systems worn outside the body, and their discovery and development may have happened independently, in many civilizations, over the course of history.
now think of artificial hearts… and not those machines that sit in operating rooms, like the ones that came out of the 80s and weighed more than 200kg, never mind the electricity and everything else around them. we are talking about systems that are implanted in the body and go out into the world with you, anywhere, like the syncardia models in the image below, which you can see in detail in the daily mail.
such things are becoming reality very quickly, and we will soon see plenty of them in people's bodies out there. in 2013 alone, 100 syncardia total artificial hearts have already been implanted, against 125 in all of last year. and there are more than 50 centers certified to perform this kind of procedure. in other words, it is ceasing to be a rarity.
in 2000, i wrote a short story about a crew that hacked into an intelligent, autonomous heart, far more sophisticated than today's syncardia. and what does that have to do with our story here? this is the week of the black hat conference in las vegas, the information security event that has become a showcase for ways of exploiting flaws that can lead to unauthorized and potentially harmful [or even lethal] use of information systems by whoever manages to break into them. the event always has surprises and will be opened, this year, by none other than general keith b. alexander, the head of the NSA [yes, the agency capturing every piece of data out there… but that is not today's subject]. so what is?
well… black hat had scheduled a talk [“Implantable Medical Devices: Hacking Humans”] that will not take place, because barnaby jack, its author, has died. to people close to him, jack had announced that he'd discovered how to break into and control pacemakers from a distance of 15m and trigger high-voltage shocks, possibly lethal to the user. in 2012, jack had shown how to hack insulin pumps and induce lethal doses of the substance in diabetics. the hacker was a specialist in medical systems and implants, an area where security is an ever more serious problem, not because the world is full of potential assassins hunting for someone with an insecure implant to eliminate, but because errors, interference and unforeseen or unprogrammed uses of such devices can have lethal impact.
in jack's honor, black hat will not replace his talk on the program. and, online, the conspiracy theories on the subject are not few. perhaps with some reason: the matter is serious enough for the FDA [the american ANVISA] to say that over the last year many incidents related to security vulnerabilities were reported, involving hundreds of devices from dozens of manufacturers. in other words, the theme [and the content, which we will now never see] of jack's talk is not the idle amusement of people with nothing better to do, and it has to be dealt with: as ICTs enter [literally] our bodies, living with certain security flaws will become unacceptable, because they may be final. you can't just reboot my body, or yours, every now and then… can you?
and the 2000 text? well… the story we are telling here repeats the fiction of 2000, which was called OS INVASORES [the invaders] and began like this:
One day, he had a pain fit to kill, or nearly. He woke up in the hospital feeling his chest, with a doctor at his bedside asking, slowly, whether he was feeling all right. The yes, a little shaky, took a while. But it came out. Recovered, he learned that his heart had been rebuilt, now composed, in part, of one of the new IntelliBeat models, which already incorporated a tiny web server for cardiac monitoring, assessment and control. And that was three years ago.
Once he got used to it, life became normal, or almost. He barely remembered that his body received (sometimes) commands from a server, in some hospital, and sent (always) bio-data to the network. The femto-server installed in his heart and the nearest pico-cell antenna served as his link to life, keeping him on the air, all the time. As if he were a radio on the network. Since the power source had ceased to be a problem, the new IntelliBeats were a phenomenal success: software maintenance and upgrades could be done remotely, without any kind of local intervention, much less surgery.
You didn't even feel a thing when the version changed.
the full text is at this link. mobile monitoring and supervision of human health, with complex devices implanted in [or attached to] the body, has stopped being a trend and become an inevitable reality. and new levels of information security [and of software development processes with and for it] have to be established, standardized, made mandatory and, beyond that, should be part of the certification process for such systems, which is [still] not the case today. but there is no way it won't be, and very soon.
We use ratings for all kinds of services, so let's try scoring the way we use the internet to check on our security and privacy
We are a ratings-obsessed culture. Critics award stars and points for films, books, restaurants, hotels, gadgets and a vast number of other products and services. Sometimes, a professional critic does the scoring, but increasingly, the public collectively creates averaged scores. Some ratings are gathered in great detail, such as the 100-point scales wine critics commonly use. Others are as simple as Facebook's "like" button, purely a measure of marketing prowess.
Some of the most useful ratings combine verifiable metrics. Consumer Reports, which has been an essential part of my reading for many years, comes up with what I consider highly trustworthy scores for automobiles and other products. A car goes through a variety of tests; the magazine then weights them according to its longstanding practices and comes up with a total score on a 100pt scale.
NGOs and thinktanks take this approach to measuring such imponderables as economic and press freedom, scoring countries around the globe for policies that aid or restrict such things. As with auto ratings, the results depend on the criteria, and there's plenty of debate about what the surveyors decide is important.
I've been wondering if we could create a system of this sort to gauge our liberty in the technology and communications ecosystem. My goal is a fill-in-the-blanks online form that people could use to a) say what gear and services they use, and then b) get back a "liberty scorecard".
The more I've explored this idea, the less sure I've become that it's doable – but that only convinces me it's worth trying. The problem, as we see with all such efforts, is that the topic is loaded with complexity, in part because technology and communications present so many kinds of trade-offs – convenience, cost, security, privacy – in the choices we make. (Trade-offs of this sort exist in all other fields as well; we could make a perfectly safe car if we were willing, and able, to pay ridiculous amounts of money for it.) Moreover, the rise of centralized online services has brought about vast choices within certain domains, but at the cost of ceding an enormous amount of control to them.
Several examples: Apple is highly restrictive with its iOS mobile ecosystem, but the devices are super-convenient to use. The GNU/Linux operating system offers the ultimate in flexibility, but isn't as convenient or easy. Microsoft's Skype is easy and convenient, but not secure from government spies who are vacuuming up all kinds of communications. Running your own mail server is safer in some respects from government dragnets than Hotmail, but it's a pain setting up and maintaining it. Netflix has zillions of movies and TV shows on its streaming system, but uses heavy-duty digital locks to ensure that you can only watch when you're online (and stores vast amounts of data about everything you do when you're using it). You get the idea.
There are degrees of safety, too: you have to decide what are the likely threats. If you set up an unsecured WiFi network at home and use that for your main connection (you shouldn't), you need to use encryption for your online activities. This is the same notion as putting a lock on your front door, to deter other people from casually wandering around your house when you're away. If you want more security, from more serious threats, you have to do much more. (Insurance companies have scores they keep for various ways you secure your home and valuables from theft and fire, and your premiums reflect that.)
The kind of threat makes a huge difference. Skype is insecure if a government is after you, but if I'm chatting with my spouse when one of us is traveling, I'm not going to worry about it.
In thinking about a tech-liberty scorecard, the topic of a talk I gave this week at the O'Reilly Open Source Conference in Portland, I've opted for simplicity – fully aware that my lack of nuance makes any such model deeply flawed. (As expected, I got a bunch of excellent suggestions from the audience on how to improve this project.) Moreover, my scoring system is loaded with value judgments, which create their own collection of problems.
How simple am I going to be with this, at least at the outset? I'm giving myself 2pt for extra effort to retain my independence from those who'd restrict it and protect my communications from those who'd spy on them. I'm giving 1pt for following what should be standard precautions. Minus 1pt for subscribing to services (like most telecommunications carriers) that have long records of cozying up to government. Minus 2pt for stupidity and laziness (for example, easy-to-guess passwords on sites involving important personal data), and for unavoidable losses of privacy and control (hello, NSA). Zero for everything else.
So, by these scores, I might get +2pt for installing Linux on my main computer; +1pt for always using a VPN; +1pt for religiously keeping my software up to date; -1pt for subscribing to Comcast's internet service; -2pt for NSA dragnet surveillance that is happening no matter what else I do; zero for using the centralized Twitter and Google+ services; and so on. You may disagree with some of these assessments, of course.
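For readers who want to experiment, here is a minimal sketch of that scoring scheme as code. The weights mirror the examples above; the item list is hypothetical and, as noted, loaded with value judgments.

```python
# A sketch of the liberty scorecard described above. Items and weights are
# illustrative; the whole point is that readers will want to change them.
SCORES = {
    "linux_main_computer": +2,  # extra effort toward independence
    "always_use_vpn":      +1,  # standard precaution
    "software_up_to_date": +1,  # standard precaution
    "comcast_subscriber":  -1,  # carrier with a record of cozying up to government
    "nsa_dragnet":         -2,  # unavoidable loss of privacy and control
    "uses_twitter":         0,  # centralized service, scored neutral here
}

def liberty_score(choices):
    """choices: iterable of item names; unknown items score zero."""
    return sum(SCORES.get(item, 0) for item in choices)

print(liberty_score(["linux_main_computer", "always_use_vpn",
                     "software_up_to_date", "comcast_subscriber",
                     "nsa_dragnet", "uses_twitter"]))  # -> 1
```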
If I proceed with this approach, I'll make clear that the goal isn't to give people a grade that defines their tech liberty in any serious way. Rather, I'll hope it's food for thought – and the start of a debate folks should have with each other, the companies they patronize and the governments that rule them. It's a conversation we all need to have.
In the fall of 2008 -- just after many of the nation's largest financial institutions teetered toward collapse, prompting the government to unleash a taxpayer-financed rescue -- I called Larry Summers at his Harvard office to ask him whether he had any regrets.
Specifically, I wanted to know how Summers had come to view his actions as Treasury secretary in the Clinton administration, where he had joined then-Federal Reserve Chairman Alan Greenspan to dismantle the government's authority to regulate trading in derivatives -- the very financial instruments then playing a central role in the crisis.
Summers immediately took charge, barking that we were off the record -- a directive that I rejected, prompting him to raise his voice. He accused me of conducting a "jihad" aimed at unfairly implicating him as a cause of the financial crisis.
I promised to call him again before my piece ran, giving him time to reflect. I left messages but didn't hear back, so I left one more, reminding him of my previous calls. When he finally called, his legendary condescension was fully engaged.
"The probability that you left me a message that I did not receive is approximately zero," he said. When it turned out that his secretary had been mixed up about the date of my messages (or maybe it was Larry who was mixed up?), he turned on her, criticizing her sharply with me on the line.
There are worse things in life than terrible phone manners, imperiousness and excessive confidence, but …
Rubin called three members of the search committee who had particular doubts …. It was true, Rubin admitted, that Summers had once had what Rubin would call "a rough edges" issue. But he'd mellowed, Rubin insisted. This was a man who'd successfully negotiated with congressional leaders and foreign treasurers, who'd survived and prospered for a decade in a viciously partisan Washington environment. His temper existed more in legend than in reality. Rubin's seal of approval worked. "Rubin made us confident that we weren't getting a bull," one member of the committee later said.

I wonder, is it Rubin again who is telling Obama, "Don't worry, he's changed"? Scheiber goes on.
As for the other key criticism of Summers—that he doesn’t play well with others, something that’s central to making the Fed work—the White House suggestion that it, too, is “outdated” strikes me as delusional or willfully ignorant. … Summers clashed constantly with fellow administration officials, most famously budget director Peter Orszag and White House economist Christina Romer. Often it was about matters of national urgency, and so a little heat could be forgiven. But all too frequently it arose from pure pettiness and immaturity. One example:
About six months into the administration, [Summers] and Orszag were scheduled to join the vice president at a White House event. When Orszag arrived, a body man seated him next to Biden, only to return a few minutes later and ask him to move. Summers had insisted on taking the seat even though it was assigned to Orszag. “I’m really sorry. We had a seating chart. But Larry walked in and saw that you were sitting next to the vice president,” the aide said. [Orszag agreed to move; a third administration official who was present confirmed this account.]
It wasn’t just running antagonists like Orszag and Romer that felt the impact of Summers’ rough edges. The origins of Rahm Emanuel’s ill-fated attempt to procure a personal car for Summers date back to a day in the fall of 2009, when Summers groused to the then-chief of staff: “Life sucks around here. I waited thirty minutes outside the Capitol because you fuckers can’t get your motor pool to work right.”
And then there were the senior Treasury officials whom Summers managed to alienate, like Lee Sachs (Geithner’s top financial consigliere during the first two years of the administration) and Matthew Kabaker, widely regarded as one of Treasury’s brightest stars at the time. Summers seemed to delight in pummeling these two during their hours-long debates over how to respond to the financial crisis. Now, as I say, these were consequential debates, so you’d expect some flashes of emotion. But Summers favored a distinctly unbecoming approach—including a need to taunt his sparring partners as they went back and forth. “Lee, you’re losing this argument!” Summers would thunder. “You’re getting crushed!”
I remember, when the Harvard faculty spoke of Summers' failings of common civility, they were repeatedly ridiculed in the press as pantywaists, frail things who bruise like the princess with the pea. So what if he is a bull in a china shop; who wants to live in a china shop? Scheiber again:

… Fed chairman simply isn’t the right job for someone with his shortcomings. … In the end, I don’t think Obama is doing Summers any favors by pushing him for it. The longer the White House holds him up as the top candidate, the more damaging it will be to Summers himself, because the list of drawbacks really is quite lengthy.
Caroline Hoxby, one of two tenured women in economics, the department where Summers teaches, spoke of the ties between scholars, their mentors, and students as "a great shimmering web" when a university is functioning at its best.
"Every time, Mr. President, you show a lack of respect for a faculty member's intellectual expertise, you break ties in our web. Every time you humiliate or silence a faculty member, you break ties in our web," Hoxby said. "When you engage in speech that harms the university's ability to foster scholarship and that is not thoughtful, not deliberate, and not grounded in deep knowledge, you break ties by the hundreds."Right after that meeting Summers experienced an epiphany. "I am determined to set a different tone. I pledge to you that I will seek to listen more, and more carefully, and to temper my words and actions in ways that convey respect and help us work together more harmoniously."
by Harry Lewis (noreply@blogger.com) at July 26, 2013 01:35 AM
The failure to comply with a records request for email metadata will cost a Washington city more than half a million dollars in statutory and attorney's fees, a Washington Superior Court judge recently decided. On June 28, 2013, the judge ordered the City of Shoreline to pay $538,555 after the Washington Supreme Court ruled that metadata associated with public records is subject to disclosure under the state's open records law.
O'Neill v. City of Shoreline arose after Beth O'Neill requested a copy of an email and its accompanying metadata to find out its original sender. The email, criticizing the city council, had been read aloud at one of the council's meetings and wrongly attributed to O'Neill. Although Shoreline provided O'Neill with a paper copy of the email and metadata from forwarded copies of the email, it never fulfilled O'Neill's specific records request -- for the metadata associated with the original email, including the sender and recipient information. The recipient, Shoreline Deputy Mayor Maggie Fimia, said she must have inadvertently destroyed it, as Shoreline was unable to find the deleted email and its associated metadata in her email folder. After the Washington Supreme Court found that the metadata was a public record, the court remanded the case to the trial court, granting Shoreline the opportunity to inspect Fimia's home computer hard drive for the metadata "only because Fimia used her personal computer for city business." The trial court, however, found that Shoreline had failed to conduct an adequate search of the hard drive, "resulting in the permanent loss of the requested public record," and that Shoreline and Fimia "violated their statutory duty to provide Plaintiffs the fullest assistance in handling their records requests."
In determining that metadata fell within the scope of public records laws, the Washington Supreme Court noted the importance of public access to information that relates to government conduct. It stated, "Our broad [public records statute] exists to ensure that the public maintains control over their government, and we will not deny our citizenry access to a whole class of possibly important government information."
With the ruling, Washington became the second state to explicitly establish metadata as within the ambit of state records laws. Arizona did so in a 2009 case, Lake v. City of Phoenix, in which the court overruled a denial of a demoted police officer's request for access to metadata regarding his performance reports, which he believed had been altered after he had reported "serious police misconduct" to his superiors. The Arizona Supreme Court noted that "[w]hen a public officer uses a computer to make a public record, the metadata forms part of the document as much as the words on the page." The court called it illogical and contrary to the openness policy of the law to "conclude that public entities can withhold information embedded in an electronic document . . . while they would be required to produce the same information if it were written manually on a paper public record."
Other states have been increasingly addressing this issue as well, although not yet at binding statewide levels. A New York appellate court and a Philadelphia trial court both found metadata to be requestable records, and a North Carolina department has adopted best practices guidelines regarding the nature and disposition of metadata. The guidelines explicitly mention the probability of metadata's inclusion under the state's broad definition of a public record.
As the states no doubt recognize, metadata plays a significant role in many contexts. In 2012, fugitive John McAfee's location was revealed after journalists posted a picture of him on their website without first turning off the location services on the phone used to take the photo, leaving his longitude and latitude available for the public to see. In addition to location tracking, metadata has the ability to uncover significant information about who is trying to influence public discussion. This was highlighted when metadata from an anonymously submitted PDF revealed that Google had authored the 38-page feedback letter opposing eBay Australia's decision to use only PayPal to accept auction payments. The disclosure showed that the feedback was Google's reaction to the potential exclusion of the use of Google Checkout, rather than a concerned citizen's impression of the company's decision. Metadata's use, however, is not limited to isolated data, such as that from a single document or photo. It can also reveal relationships. A recent MIT project shows the capacity for metadata -- for example, from phone calls or emails -- to uncover personal relationships, whether those relationships are public knowledge or secret.
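How much a single photo can leak is easy to check for yourself. Here is a minimal sketch using the Pillow imaging library to read the GPS block of a JPEG's EXIF metadata, roughly the inspection that was skipped in the McAfee episode; the file name is hypothetical and error handling is omitted.

```python
# A sketch of reading location metadata hidden in an ordinary JPEG, using
# the Pillow library. "photo.jpg" is a hypothetical file name.
from PIL import Image
from PIL.ExifTags import GPSTAGS

def gps_metadata(path):
    exif = Image.open(path)._getexif() or {}
    gps_raw = exif.get(34853)  # 34853 is the EXIF GPSInfo tag
    if not gps_raw:
        return None
    # Translate numeric GPS tag IDs into readable names
    return {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}

print(gps_metadata("photo.jpg"))  # e.g. latitude/longitude of the camera
```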
We know that metadata is powerful, but what exactly is it? Metadata can mean many different things, and which definition applies can dictate very different outcomes. Generally speaking, metadata has been defined as "data about data." It can include a record's date, location, or creator; the device on which a record was created; the duration of phone calls or web browsing; and much more. The metadata from a Twitter account alone can include multiple personal identifiers, as the Guardian's metadata guide illustrates (a short sketch of reading these fields programmatically follows the list):
• the user's name, location, language, profile bio information and URL,
• when the account was created,
• usernames and unique identifiers,
• a tweet's location, date, time and timezone,
• a tweet's unique ID and ID of tweet replied to,
• contributor IDs,
• followers, following and favorite counts,
• verification statuses, and
• the application sending the tweet.
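To make that concrete, here is a minimal sketch of pulling the fields above out of one tweet's JSON, as returned by Twitter's v1.1 REST API; the file name is hypothetical.

```python
# A sketch: extract the metadata fields listed above from one tweet's JSON.
# Field names follow Twitter's v1.1 API format; "tweet.json" is hypothetical.
import json

tweet = json.loads(open("tweet.json").read())

meta = {
    "user":         tweet["user"]["screen_name"],
    "bio":          tweet["user"]["description"],
    "account_made": tweet["user"]["created_at"],
    "tweet_id":     tweet["id_str"],
    "reply_to_id":  tweet["in_reply_to_status_id_str"],
    "sent_at":      tweet["created_at"],
    "geo":          tweet.get("coordinates"),   # tweet's location, if any
    "client_app":   tweet["source"],            # application sending the tweet
    "followers":    tweet["user"]["followers_count"],
    "following":    tweet["user"]["friends_count"],
    "verified":     tweet["user"]["verified"],
}
print(json.dumps(meta, indent=2))
```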
With the amount of information embedded in just one account, the intelligence to be gained from sources' metadata is substantial -- as metadata attaches to many services the public uses daily, such as email, phones, word processing documents, PDFs, and online search requests.
The scope of information metadata encompasses and its inconsistent definitions, however, can be problematic in the context of open records laws. Generally, records requesters must describe the records sought with "reasonable specificity." Without a clear definition of what constitutes metadata, people may ask for information rather than identifying a specific record. This makes it difficult to pin down an actual document; because governmental bodies are not compelled to create records that don't already exist, they can deny a request that doesn't sufficiently describe a particular existing record. Additionally, at least one court has found that a record's attached metadata is not inherently part of that record, so any request for a specific document must explicitly include a separate request for metadata if it is desired.
Similarly, many states only require governmental entities to provide a record in the original format in which the entity stores it. Therefore, metadata that may have been attached to a document at some point may not be available if the government does not maintain the document in electronic form. Yet, even if an entity does include and provide electronic data as records, it is unclear how far a metadata request can reach. With the breadth of metadata definitions, it could even be interpreted as allowing a person to request an entire database, an issue the Utah Court of Appeals faced last year but did not ultimately decide.
The goals of open records laws should play a meaningful role when entities determine whether to release metadata attached to public documents. State records statutes frequently exclude drafts from public access so as not to stifle ideas and revisions throughout the legislative process. As metadata can clearly indicate who contributed to what drafts and at what time, records custodians will need to thoroughly evaluate what metadata should remain private to preserve this purpose, which may be a difficult call to make. Custodians will also need to watch out for other metadata that potentially falls outside the scope of records accessible to the public -- such as metadata comprising personal information (such as medical information) that does not appear in the primary record.
With potentially difficult decisions to be made, states can help alleviate confusion about metadata in several ways. The ambiguity surrounding metadata's definition and exactly when -- or if -- it meets the public record classification should prompt legislators to clearly define what constitutes metadata and implement procedures to regulate its collection and preservation. States could explicitly state that metadata is a form of "electronic data," which is often included under statutory definitions of a "record." Further, states that haven't already done so could require that all government records be maintained in their original form, thus preserving any metadata for future requests. Such clarifications would improve the efficiency and understanding of metadata as government records, further promoting the public interest purpose of open records laws.
Rebekah Bradway is an intern at the Digital Media Law Project and a rising 3L at the S.J. Quinney College of Law at the University of Utah.
(Photo courtesy of Flickr user Richard Akerman pursuant to a Creative Commons CC BY-NC 2.0 license.)
"If a branded school is unable to persuade its students to pay their market fees, then it suggests that the brand may not be so attractive after all," said [a government official].Still, others are staying, and the Yale-NUS campus is set to open next month. It will be interesting to see whether there are enough environmental nutrients (that is, Singaporean and American dollars, and Asian students) to keep them alive. So far, none of the closures seem to be related to the issues that deeply concern the Yale faculty: how to teach in the spirit of open inquiry and free speech in a place where you can be jailed for criticizing the government (or for homosexuality, or a variety of other things that are not constraints in American universities).
by Harry Lewis (noreply@blogger.com) at July 25, 2013 06:04 PM
The Media Cloud team is pleased to announce the release of a new paper, Social Mobilization and the Networked Public Sphere: Mapping the SOPA-PIPA Debate, authored by Yochai Benkler, Hal Roberts, Rob Faris, Alicia Solow-Niederman, and Bruce Etling, along with a set of interactive maps that cover selected weeks in the COICA-SOPA-PIPA controversy.
In this paper, we use a new set of online research tools to develop a detailed study of the public debate over proposed legislation in the United States that was designed to give prosecutors and copyright holders new tools to pursue suspected online copyright violations. Our study applies a mixed-methods approach by combining text and link analysis with human coding and informal interviews to map the evolution of the controversy over time and to analyze the mobilization, roles, and interactions of various actors.
This novel, data-driven perspective on the dynamics of the networked public sphere supports an optimistic view of the potential for networked democratic participation, and offers a view of a vibrant, diverse, and decentralized networked public sphere that exhibited broad participation, leveraged topical expertise, and focused public sentiment to shape national public policy.
We also offer an interactive visualization that maps the evolution of a public controversy by collecting time slices of thousands of sources, then using link analysis to assess the progress of the debate over time. We used the Media Cloud platform to depict media sources (“nodes”, which appear as circles on the map with different colors denoting different media types). This visualization tracks media sources and their linkages within discrete time slices and allows users to zoom into the controversy to see which entities are present in the debate during a given period as well as who is linking to whom at any point in time.
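To illustrate the general technique (a simplified sketch of link analysis, not the Media Cloud codebase itself; the sources and links below are invented), each time slice can be modeled as a directed graph of media sources, with in-link counts and PageRank suggesting which nodes anchor the debate during that slice.

    # Toy link analysis over one time slice; the edges are invented examples.
    # Requires the networkx package.
    import networkx as nx

    # Each edge means "source A linked to source B" during the slice.
    links = [
        ("techdirt.com", "wikipedia.org"),
        ("boingboing.net", "techdirt.com"),
        ("nytimes.com", "techdirt.com"),
        ("reddit.com", "boingboing.net"),
        ("reddit.com", "techdirt.com"),
    ]
    G = nx.DiGraph(links)

    # Sources ranked by incoming links, then scored by PageRank.
    print(sorted(G.in_degree(), key=lambda kv: -kv[1]))
    print(nx.pagerank(G))

Repeating that computation week by week and watching the rankings shift is, in miniature, what the interactive maps let you do.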
The authors wish to thank the Ford Foundation and the Open Society Foundation for their generous support of this research and of the development of the Media Cloud platform.
Media Cloud, a joint project of the Berkman Center for Internet & Society at Harvard University and the Center for Civic Media at MIT, is an open source, open data platform that allows researchers to answer complex quantitative and qualitative questions about the content of online media. Using Media Cloud, academic researchers, journalism critics, and interested citizens can examine what media sources cover which stories, what language different media outlets use in conjunction with different stories, and how stories spread from one media outlet to another. We encourage interested readers to explore Media Cloud.
The Berkman Center for Internet & Society is pleased to announce the release of a new publication from the Media Cloud project, Social Mobilization and the Networked Public Sphere: Mapping the SOPA-PIPA Debate, authored by Yochai Benkler, Hal Roberts, Rob Faris, Alicia Solow-Niederman, and Bruce Etling.
From the MediaBerkman blog:
Revelations of the NSA’s data surveillance efforts have raised serious questions about the ethics and necessity of violating privacy that have been bubbling under the surface for some time.
Efforts to monitor communication are nothing new, but electronically mediated communication has increased the amount of information being shared, and the possibilities for eavesdropping are endless.
But there's a trade-off. People tolerate incursions into privacy in exchange for greater security or even convenience: health care, transportation, public safety, or any number of web utilities we use on a daily basis.
Bruce Schneier is an author, Berkman fellow, and security technologist. He recently sat down with David Weinberger to talk about the positives and perils of privacy violation. Continue over to MediaBerkman for the audio and more...

In case you missed it, I recently published a piece in RBMA mag about the history of the Dembow, a history I’ve been working to tease apart and put together for a looooong time now.
If you’re not familiar with RBMA, it stands for Red Bull Music Academy. And I was pretty happy to be invited to do something there. If you’re unfamiliar, you should get familiar. Red Bull may seem like a strange sponsor for music culture (though they’ve been well integrated as a beverage for more than a decade), but they’ve been sponsoring great stuff lately, from hosting a dope cross-cultural soundclash between some of NYC’s top sounds to commissioning some of my favorite writers to produce punchy pieces on all manner of musical topics. (And their lecture series has been full of revelations.) See, e.g.: Noz on the history of hip-hop mixtapes, Rishi Bonneville on Caribbean pirate radio in New York, Jeff Weiss on the cultural history of the airhorn, or this rich recent interview with Kode9. Oh, and don’t miss the pieces helpfully & aptly linked from the bottom of my own contribution: a chat with Steely & Clevie and a piece on the one and only Philip Smart by Rob Kenner.
Thanks to Todd Burns for the keen editing, making things nice and concise. Per usual, I’m going to take the opportunity to use my blog to run an author’s cut, or an unabridged version. A couple missing paragraphs below help flesh out the picture, especially regarding the Afro-Jamaican roots — and, hence, pan-Caribbean / Afrodiasporic resonance — of the dancehall riddim that started it all. A phrase like “Steely & Clevie’s post-Poco riddim” might seem like a slightly cryptic reference without this particular passage (i.e., paragraph #4 below); but maybe people thought I was calling it post-colonial, which is also true.
I’m also happy to report that a forthcoming issue of Wax Poetics will feature an article I wrote entirely about the (once mysterious) origins of reggaeton’s bedrock riddim on the unlikely outpost of Long Island, heavily featuring Boom’s manager Pucho Bustamante (who I interviewed a few years ago on MySpace). Will let you know soon as that one’s ready to read!
For now, head over to RBMA for their slick version, see below for the full monty, & check out this video I whipped up (also at the RBMA site & embedded below) to see & hear how the various versions all relate. If you want to get even more dembow in your ears, there’s lots to find around the web, but here are a couple of mixes I’ve made that focus on it: Dembow Legacies, Dembow Dem.
Without further ado, let’s loop –
[video: a side-by-side tour of how the various versions of the Dembow relate, embedded at the RBMA site]
In the world of sample-based music, few recordings have enjoyed so active an afterlife as the Dembow. A two-bar loop with unmistakably familiar kicks and snares, it underpins the vast majority of reggaeton tracks as an almost required sonic signpost. Thanks to crossover jams like Lorna’s “Papi Chulo” and Daddy Yankee’s “Gasolina,” the Dembow has spread its distinctive boom-ch-boom-chick to glossy Latin pop, raw electro-chaabi in Egypt, transnational moombahton, and Indonesian dangdut seksi, to name a few.
With such remarkable resonance and staggering frequency of appearance, the Dembow would seem to deserve a place alongside such well-worn loops as the Amen break, the Triggerman, the Tamborzao. All these brief but inspired moments “on tape”—and all of them rolling drum rhythms—after having been sampled and looped and diced and spliced by hundreds and hundreds of digital-age producers, have proven so crucial to the sound of entire genres that they have taken on names, and lives, all their own.
There are a few things, however, that make the Dembow an unusual member of the sample canon. For one, the recording most often identified as the origin of the sample is not actually the source of reggaeton’s favorite loop, not exactly anyway. It’s true that Shabba Ranks’s anti-gay, anti-imperialist anthem “Dem Bow” may as well be patient zero for the infectious rhythm that still carries the song’s name, but samples of the track accompanying Shabba—the riddim in reggae parlance—rarely actually turn up in reggaeton. Jamaican studio duo Steely and Clevie deserve credit for the bouncy beat they boiled down for Bobby Digital, but not as the creators of an intensely re-used sound recording. Rather, their riddim planted the seed that would grow into what we now call Dembow.
Like other popular riddims the duo produced in the early 90s, especially Poco Man Jam (to which Dembow is audibly indebted), the track accompanying Shabba’s rally-cry draws on the deep rhythms associated with Pocomania, a neo-African Jamaican religion with practices and aesthetics that run parallel to other post-slave cultures across the Caribbean. The driving boom-ch-boom-chick that emerges between the steady kick on each beat and the polyrhythmic play of the snares can also be threaded through rumba, salsa, soca, bachata. It’s at the heart of what’s been called jazz’s “Spanish tinge,” known variously as the cinquillo or the habanera. This may help explain the broad appeal of these particular Jamaican recordings, why Puerto Rican hip-hop producers moved more or less wholesale into making Spanish dancehall, and how reggaeton so quickly swept across dance scenes across the Americas and beyond. Shabba’s “Dem Bow” was a big chune in the wide world of reggae, and not just because of its bullish stance, colorful lyrics, and catchy chorus.
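If you want to see that pattern on a grid, here's a quick sketch; the transcription is mine, one common way of notating the groove, not anything lifted from the recordings themselves. Kicks land on every quarter note while the snares trace the 3+3+2 spacing of the tresillo/habanera.

    # One common transcription of the dembow groove: a 4/4 bar on a 16-step grid.
    STEPS = 16
    kick = {0, 4, 8, 12}     # steady four-on-the-floor
    snare = {3, 6, 11, 14}   # 3+3+2 spacing: boom-ch-boom-chick, twice per bar

    for name, hits in (("kick ", kick), ("snare", snare)):
        print(name, "".join("x" if s in hits else "." for s in range(STEPS)))

    # Output:
    # kick  x...x...x...x...
    # snare ...x..x....x..x.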
But rather than samples of Steely & Clevie’s riddim resounding from trunks across the Spanish-speaking world, and rather aptly given reggaeton’s transnational roots, the set of sounds most often identified as the Dembow per se (as opposed to just the generalized rhythm which, confusingly, is also sometimes called Dembow), is a version cooked up by Jamaican and Panamanian collaborators laboring on Long Island, NY in the early 90s to create reggae en español anthems—and succeeding.
By the early 90s, Philip Smart’s HC&F studio was the premier spot for producing dancehall hits, Jamaica notwithstanding. A native Kingstonian who apprenticed under King Tubby, Smart moved to New York in the mid-70s and launched HC&F in 1982, enlisting as house musicians fellow expatriates such as Dennis “The Menace” Thompson, the sole musician credited with “Dub Mix II,” better known today as the Dembow riddim, or in Panama, the Pounda. Initially crafted as an instrumental for Panamanian vocalist Nando Boom’s “Ellos Benia,” a close translation of Shabba’s “Dem Bow,” Thompson captured the rhythmic essence of Steely & Clevie’s post-Poco riddim while adding some digital timbales and other touches for extra sabor at the prompting of Ramon “Pucho” Bustamante, the Panamanian manager of Nando Boom who helped engineer the reggae en español movement. The wordless version that would soon play backing track to hundreds of Puerto Rican rap parties was not actually released until two NYC-based Jamaican deejays, Bobo General and Sleepy Wonder, recorded their own single over the riddim, “Pounder,” with the dubbed-out instrumental as a quickly coveted B-side. (“A bad custom of the Jamaicans,” Bustamante once told me.)
When instrumental CDs such as Pistas de Reggaeton Famosas include a “Dem Bow” track—and they always include at least one—the track labeled as such is nearly always based on the drums Dennis the Menace laid down for Nando Boom at HC&F. Likewise, do a search for “dembow loop” on YouTube or 4shared, and you’ll hear the same echoes there too. By this point, the instrumental has been looped, compressed, remastered, and reconstituted dozens of times over. But the lineage is audible, and it makes Dennis and company’s Dembow one of a few recordings, like the Funky Drummer or the Apache break, that have provided the basis for hundreds if not thousands of other tracks.
The story of the Dembow and its legacy gets even more complicated, since beyond a relatively small circle of reggaeton producers and connoisseurs, when most people say Dembow, they refer to its rhythm—the boom-ch-boom-chick pattern—more generally. And in practice, reggaeton producers have been chopping up dancehall riddims and recombining them with a greater interest in split-second allusion than faithful reproduction. While wholesale loops of Dembow do sometimes appear, reggaeton drum tracks tend more often to comprise samples drawn from a small storehouse of treasured timbres: a handful of reggae riddims which have animated Spanish-language dancehall for decades. Bam Bam, Fever Pitch, Drum Song, and yes, Dembow, are all common sources, but the ingredients could come from almost anywhere if they sound right. Reggaetoneros swap sample sets like playing cards, and a willy-nilly archive of reconfigurable samples traverses the North and South American Hulkshare-osphere like a reggaeton robotics kit. For lots of listeners and producers, any of the snares from these well-worn riddims, or any snare with similar properties, could suffice to say Dembow.
A line can be drawn from Steely & Clevie, through Smart and Thompson and Bustamante, to what we call Dembow today, but for all that collective, transnational effort, the foundation for this single recording’s remarkable resonance was most crucially fashioned in mid-90s San Juan by proto-reggaeton pioneers like DJ Playero and The Noise. On their seminal underground mixtapes, these Puerto Rican producers took a hip-hop hatchet to dancehall riddims, chopping up favorite drum loops, basslines, and riffs to create dynamic, reference-laden collages of contemporary club beats for local rappers’ double-time, flip-tongue, street-level lyrics. Over the course of Playero 38 or The Noise 6 one hears a constantly shifting bed of beats composed of signature samples from Bam Bam, Fever Pitch, and the like. Dembow was such a staple source that the entire genre for a time, after being known as underground but before reggaeton, was simply called dembow.
Crucially, around the turn of the millennium, the Dembow—and Puerto Rican reggae en español more generally—was transmuted and extended by DJ Blass. With the rise of Fruity Loops and other software, techno-inspired bleeps, presets, and arpeggios could be sutured to Dembow snares for a killer club-ready concoction. Blass’s mixtapes like Sandunguero and Reggaeton Sex changed the sound of what would soon be crowned reggaeton while maintaining important links to predecessors. Namely, by chopping well-worn loops into discrete kicks and snares, Blass could nod to the riddims that dancers, vocalists, and audiences had come to love while shaping the sounds into his own lean patterns. Blass’s influential techniques carry forward into the productions of the duo who finally took reggaeton to the pop charts and the Anglo mainstream, Luny Tunes.
If you listen to the track Luny Tunes produced for their biggest hit, “Gasolina”—or most of their other pistas—you’ll hear snare samples swap every four measures, embodying in their own subtle but audible manner the loop-switching practices of Playero’s proto-reggaeton. Revising the Dembow as something more general, more flexible, and in its way, less Jamaican than it had been, Luny Tunes honored reggaeton’s rhythmic and timbral heritage while opening it up to a new variety of textural, harmonic, and melodic gestures, especially “pan-Latino” sounds. When Wisin y Yandel reprise Shabba’s chorus for their club-friendly, bachata-steeped, Luny Tunes-produced update of “Dem Bow” in 2003, the phrase has little to do with imperialism or sexual orientation and everything to do with the backbone beat and criss-crossing snares that compel people to perreo, or do the doggystyle dance so synonymous with the genre.
In the decade since reggaeton galloped into the mainstream, the Dembow has been Cubanized, Colombified, Peruvinated, watered-down, dressed-up, and recomposed to fit a thousand new contexts. Recently, the rhythm—and to a lesser extent, the riddim—has even made inroads into the more frequently foursquare world of EDM via Dave Nada’s moombahton, where Dembow comes full circle in a strange and surprising way. Nada famously invented moombahton by slowing down Dutch house tracks to please a house of reggaeton-loving teens, but the reason this worked was precisely because Dutch house had itself absorbed Caribbean rhythms via bubbling, a short-lived but influential local club scene clustered around Rotterdam, Amsterdam, and the Hague. Producing personalized soundtracks for dance battles, first- and second-generation kids from Curacao and Suriname made hyperspeed, bricolage remixes of the same dancehall riddims that had Puerto Rican youngsters going nuts across the Atlantic.
Slowed down once again and rebranded as moombahton, Nada’s wildly successful experiment introduced the Dembow to new listeners across the networked world, especially after producers like Rotterdam’s Munchi heard ways to move beyond screwed house remixes and connect the burgeoning genre to its Puerto Rican cousins. Munchi was initially drawn to the genre because of his love of Dembow and reggaeton and the possibilities moombahton offered to revisit these irresistible rhythms: “The idea was so simple,” Munchi wrote to me, describing moombahton as “THE chance for reggaeton to get out of its hole.” Having nearly abandoned the stagnant genre, Munchi noted that “It felt so good that I could make ‘reggaeton’ again.” And while no one would confuse Munchi’s genre-busting work with reggaeton per se, no one could deny the genre’s presence in his tracks.
For his part, Nada himself has occasionally sampled the actual Dembow riddim for his moombahton productions (though he wouldn’t say which ones), but like many others, Nada more often recreates his own Dembow-indebted patterns using a variety of drum sounds and samples. “I’ve used it in the past to help dirty up a few tracks. I’ll mangle the sample and bury it though.”
Moombahton may have already enjoyed its moment in the social media sun, but there are other corners of the so-called global bass scene where that old boom-ch-boom-chick still resounds. “The post-tropical flight from Caribbean percussion at the end of the mini-Moombathon craze has left a large side of EDM dembowless lately,” says Rizzla, whose soca and reggaeton influences help to keep Caribbean polyrhythms in the metropolitan mix. Rizzla trawls 4shared and Hulkshare for Dembow tracks and samples but reports that, “Most of the time I use sampled individual drums and reconstruct a Dembow with variations I make myself.”
Dubbel Dutch describes a similar process for his own productions: “I personally have never sampled the Dembow riddim but have used various rhythmic cousin ‘Dembow’ loops in my productions. Most of these I’ve found via reggaeton sample packs downloaded from 4shared while searching for Mexican tribal and perreo tracks.” Bearing witness to the sonic priorities of digital bass culture, Dutch confesses that, “Admittedly, my awareness of certain loops has even preceded my knowledge of their origins.” Accordingly, he repurposes cherished dancehall loops without being parochial, which actually places him squarely in the reggaeton tradition: “One of my favorite ‘Dembow’ loops comes from the Fever Pitch riddim. That one keeps popping up at various speeds in a lot of my tracks. It manages to work flawlessly at just about any tempo, whether it’s a Dutch bubbling track or an 80 bpm reggaeton beat, which is sort of a rare quality for any loop to have.”
Not unlike their sample-raiding peers in reggaeton, then, producers such as Rizzla, Dubbel Dutch, and Uproot Andy tend toward an inclusive idea of what constitutes the Dembow riddim, complicating simple narratives of a single sample’s afterlife. “I’d say the Fever Pitch (aka Rich Girl) ‘Dembow’ loop is a better possible candidate,” Dubbel Dutch argued, “for an Amen or Think type breakbeat.”
For Uproot Andy, who recently released Worldwide Ting, which he calls “an hour long celebration of the Dembow in all kinds of contexts, some natural and some forced,” even such tributes are necessarily mongrel in their make-up: “The opening track is a song I just made called the ‘Worldwide Dembow’ and it’s sort of an homage to the Dembow rhythm, it samples Pablo Piddy, a Dominican dembow artist, saying ‘si tu quiere dembow,’ and the tune is basically a reimagining of Drum Song riddim (melodically), and Fever Pitch riddim (rhythmically), although it doesn’t actually sample either of them, but pretty much picks apart the elements and recreates them with more synthetic sounds.”
Uproot Andy’s reference to Dominican dembow brings us full circle in this lively, and living, story of a loved loop. No place today can lay stronger claim to bearing the Dembow flame than the Dominican Republic, where a rejuvenated version of San Juan’s proto-reggaeton, in all its referential richness, manages to move kids on the streets (and YouTube) and, increasingly, to move into the pop sphere as well.
In the mixes of DJ Scuff and countrymen—or, say, just about anything in the Dominican dembow Soundcloud group—the Dembow (as such) is on constant, quicksilver rotation with chops and stabs from Bam Bam, Fever Pitch, Poco Man Jam and the like. But once again, enthralled as Dominican dembow may be with such well-worn samples, its restless producers also emulate the voracious and pliant approach of their mid-90s muses, Playero and the Noise. So a classic hip-hop break like Think, or even funk carioca’s Tamborzao, might make it into the mix. But no matter how wide the circle of references, the name of the genre bears witness, at bottom, to the fact that Dominican dembow is built on a commitment to some relatively old riddims and some far older rhythms.
For Linton Kwesi Johnson, the UK-based dub poet and bass culture theorist, the same dancehall riddims so central to the Dembow variations were popular precisely because they can sound at once modern and traditional. “On one hand, this music is totally technological,” he notes, “on the other the rhythms are far more Jamaican: they’re drawn from Etu, Pocomania, Kumina—African-based religious cults who provide the rhythms used by Shabba Ranks or Buju Banton. So despite the extent of the technology being used, the music is becoming even rootsier, with a resonance even for quite old listeners, because it echoes back to what they first heard in rural Jamaica.”
Uproot Andy offers a similar take: “If reggaeton took the rhythm and ran with it, Dominican dembow brings it strictly back to the roots.”
Here’s what you’re seeing/hearing in the video above:
first, shabba ranks’s “dem bow” produced by steely & clevie (for bobby digital)
then, nando boom’s “ellos benia” produced by dennis the menace (for philip smart & pucho bustamante)
then, the instrumental of the boom track, released as “dub mix II” on b-side of “pounder” by bobo general & sleepy wonder
then, a commonly circulating version of the dembow riddim (“original”), audibly related to the dennis the menace instrumental, if a bit beefed up and boiled down
finally, a return to “dub mix II” to hear how dennis the menace added subtle dub effects to his track — sounds which never turn up in reggaeton productions because of the way the loop circulates as a digital (re)sample rather than a vinyl b-side
For families that are sending their teens to college this fall, summer is full of anticipation and planning. There’s so much to buy and pack and think about that sometimes parents forget to think about something really important: their health.
As parents, we are in charge of our child’s health—their diet, their exercise, their medications and what happens when they get sick. But when teens leave home, we need to be sure they can handle these things, and make good decisions, by themselves. Not that we can’t help out. I get lots of phone calls from my college-aged kids about health stuff, but it’s different when we’re not right there. Besides, this is something young adults need to learn to manage.
If your child has a chronic medical problem, it’s particularly important to make an appointment with your doctor and talk about how best to manage this. You’ll want to have copies of important records, and have some conversations ahead of time with a medical person at or near college, so that your child can get the care he or she needs. It’s also important to have conversations with your child about what to do and when. Your doctor can help with all of this, including deciding whether you should contact a local specialist.
Whether or not your child is usually healthy, here are some tips:
Make sure your teen can get health care at college. There are two aspects of this:
If your child takes prescription medication(s), plan ahead. See if you can get a 90-day supply through your insurance. If not, you will need to find a way to either get medication to your child or find someone to prescribe it at college. This is not something you want to figure out when there’s one pill left.
Shop for supplies. When your teen goes to college, he leaves the bathroom cabinet behind. Here are a few things you should pack in a kit (along with hand sanitizer):
Make some diet and exercise plans. The Freshman 15 is common for all sorts of reasons—including the fact that parents aren’t around to provide and supervise healthy meals and make sure that teens are getting some exercise. Talk about foods they can and should eat and help them make a plan for when and how to exercise too. Being proactive can make all the difference.
Talk about risky behaviors. We all want to think that there is no way our teens will drink, have sex or do anything else like that at college. But the reality is that lots of them do. You can help keep your teen healthy by:
Check in with your doctor. Your child may or may not need a visit, but check to see if there are any vaccines (such as a Menactra booster) or prescription refills or other things that need taking care of before he leaves. Call now, so you’re not scrambling to get an appointment the day before you have to be there!
For more information, check out “Healthy Tips for the College Freshman” from the American Academy of Pediatrics.
This post is part of a series of posts featuring the stories from our STEM Story Project.

In the fall of 1902, twelve robust young men in suits gather in the basement of a government building in Washington, D.C. Waiters serve them dinner on fine china, prepared by chefs–courses like chipped beef, turnips, celery on toast, and applesauce. The men eat what they’re served, even though they know that their food is poisoned. They do this every day, three square meals a day, for months.
This is the story of the Poison Squad, an experiment that begins in that basement dining room and continues on our dinner plates today.
Harvey Washington Wiley is the mastermind behind this experiment. Before you condemn him, know that you probably owe him a debt of gratitude: Wiley is the founding father of the Food and Drug Administration. The intention of these experiments was not to induce digestive discomfort for its own sake. Rather, they were part of an extensive study of how chemical preservatives in food–before regulations existed–could harm human beings over time. You might cringe at what was once used to keep food “fresh.”
PRX STEM Story Project producer Sruthi Pinnamaneni gave us a closer look inside the story, beyond her radio piece. About diving deep into archival materials, she says,
“I spent hours [at the Library of Congress], reading thousands of [Wiley's] letters and squinting at his tiny journals. It is when you know every curve and squiggle of a man’s handwriting that you feel as though you’re starting to get to know him!”
One surprising fact that she discovered while researching the piece was that while Wiley’s experiments contributed so much to food regulation, today’s practices still leave something to be desired:
“…The FDA doesn’t really test food additives anymore. There are more than five thousand additives commonly found in processed food and most of them haven’t been tested on animals and almost none (except for dietary supplements) have been tested on humans.”
Sruthi sent us some photographs of the Poison Squad, Wiley, and some (how shall I put this?) unconventional tools that were used during the experiments.
“None but the brave can eat the fare.” Are you brave enough? Full serving of intrigue and radio in this piece. Bon appetit.
All photos: FDA
Listen to all the other PRX STEM Story Project pieces.
Last month, we explored the very real environmental costs of Internet services, particularly those raised by the growing trend of moving data processing and storage into “the cloud.” Since then, the energy and carbon costs of the Internet and cloud computing have been a hot topic. Some IT companies have been capitalizing on the concern as well. Last month, Microsoft began touting its Internet Explorer 10 browser as the most energy-efficient browser available, estimating that if every Google Chrome and Mozilla Firefox user in the US switched to IE 10, the energy saved over a year could power 10,000 households for that year. eBay has taken things a step further and is publicly disclosing its energy usage with an impressive online dashboard.
In another exciting development, researchers at the University of Illinois at Urbana-Champaign are hoping to combat e-waste. Project leader John Rogers, a materials scientist, recently gave details on the project, which aims to develop circuit boards that safely decompose when exposed to water. In his report, Rogers suggests that consumer “dissolving” electronics are on the horizon.
Additionally, this week the US Department of Energy announced plans to establish minimum efficiency standards for all servers and computers sold in the United States.
Perhaps the most significant development over the past month is the publication of a study on the efficiency of data servers by Jonathan Koomey at Stanford University’s Steyer-Taylor Center for Energy Policy and Finance. Koomey’s study found that larger companies like Google, Facebook, Amazon, and eBay have been working toward building sustainable data centers. However, the study also found significant waste from organizations whose servers are used by media companies, government, universities, and airlines. The study concluded that many servers could be easily and cheaply modified to use up to 80% less energy, but that a major obstacle to implementing these changes is the gap between who produces and installs the technologies and who is responsible for paying the electric bill. Koomey summarized the gap by asking “Who designs and builds your cable box? The cable company. Who pays the electric bill? You do. So, you end up with a cat warmer on your shelf.”
Another problem the study identifies is the tendency of policymakers and environmental organizations to focus on using renewable energy instead of improving the efficiency of current data centers. Koomey suggests institutions make their centers efficient first and worry about switching to renewable energy sources afterward; because renewable sources typically deliver less power than traditional ones, a smaller load makes the switch easier and more practical. In other words, fix the leaky pipes before worrying about making a more efficient pump. The good news is that once these institutional issues are addressed, many data servers can be made more efficient using off-the-shelf equipment and simple management strategies.
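To get a feel for the money at stake, here is a back-of-the-envelope sketch; the wattage and electricity price are illustrative assumptions, not figures from Koomey's study.

    # Rough annual cost of one always-on, underused server; inputs are assumptions.
    IDLE_WATTS = 250.0        # assumed average draw of a lightly used 1U server
    HOURS_PER_YEAR = 24 * 365
    PRICE_PER_KWH = 0.10      # assumed US commercial electricity rate, USD

    annual_kwh = IDLE_WATTS * HOURS_PER_YEAR / 1000.0  # ~2,190 kWh
    annual_cost = annual_kwh * PRICE_PER_KWH           # ~$219 per server
    savings = annual_cost * 0.80                       # the study's "up to 80%" case
    print(f"{annual_kwh:.0f} kWh/yr, ${annual_cost:.0f}/yr, "
          f"potential savings ${savings:.0f}/yr")

Multiply that across a rack, and then across an organization whose electric bill lands on a different budget line than its hardware purchases, and the split incentive Koomey describes becomes easy to see.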
Mr. Keating found those involved in the searches to have acted in good faith and with a guiding desire to safeguard the confidentiality of the Ad Board process. That aspect of his report is reassuring. His detailed account of how these searches were done, however, makes it even clearer than before that there is much work ahead in improving the University's policies and protocols concerning privacy of, and access to, electronic communications.

President Faust's statement is even stronger:
Unfortunately, the detailed factual account in Mr. Keating’s report deepens my already substantial concerns about troubling failures of both policy and execution. The findings strengthen my view that we need much clearer, better, and more widely understood policies and protocols in place to honor the important privacy interests that we should exercise the utmost vigilance to uphold. A university must set a very high bar in its dedication to principles of privacy and of free speech; these are fundamental and defining values of our academic community. The searches carried out last fall fell short of these standards, and we must work to ensure that this never occurs again.

And in the Globe, Lee said very much the same thing:
"Hopefully, it will never happen again,” William F. Lee, a member of the Harvard Corporation, said in an interview. …. Even with “the unprecedented nature of the events, the urgency of the events, the fact that students’ privacy and individual rights were involved, it’s clear that the policies that we as a university had in place were inadequate to the task,” Lee continued.I am very glad to see such a consensus around the need for better and more exacting policies and practices for electronic searches of Harvard communications. On page 20, Mr. Keating's report notes two existing policy statements. One is plainly applicable only to students. The other is in the "Information for Faculty" handbook, and reads as follows:
The unauthorized examination of information stored on a computer system or sent electronically over a network is a breach of academic and community standards. Authorized system support staff however, may gain access to users’ data or programs when it is necessary to maintain or prevent harm to the University, its computer systems or the network.

For the life of me, I can't remember ever seeing that. The Information for Faculty is not voted by the Faculty, and I do not know who wrote that paragraph or when this language entered the handbook. It is a document professors are expected to know and obey, but since it is not faculty legislation, we are unlikely to notice changes year to year unless a dean points them out. Nor do I know by what authority language like this gets added.
Compare the policy in Harvard's personnel manual:

Electronic files, e-mail, data files, images, software and voice mail may be accessed at any time by management or by other authorized personnel for any business purpose. Access may be requested and arranged through the system(s) user, however, this is not required.

The report does, however, dismiss the FAS Faculty email privacy policy as basically nonexistent, and in fact unknown to the Office of General Counsel. Footnote 3 on page 20 states:
The OGC Attorney did not review a policy, entitled “FAS Policy Regarding the Privacy of Faculty Electronic Materials,” which was posted on an Information Security & Privacy website at <http://security.harvard.edu>, rather than the FAS website. That policy had never been approved by the OGC, and in 2012, no attorney at the OGC appears to have been aware of its existence. Also, neither the HUIT Employee nor the FAS administrators who approved the searches knew about it. In 2005, the FAS consulted with the OGC concerning a proposed privacy policy, but the OGC was never advised that the policy had been adopted by anyone or posted to this particular website. Also, as a matter of practice, the OGC is not asked to “approve” privacy policies adopted by different institutions at the University. Further, when the OGC is asked for guidance about whether particular actions comply with applicable policies, it focuses on the policies posted to the website for the relevant institution, such as the FAS website, or printed in the relevant handbook.
For context, here is an email I sent at the time:

I raised this question with Richard for the I/T committee a few days ago. As I have learned a little more, thanks to conversations with [Harvard University Chief Information Officer] Dan Moriarty and with Hal Abelson of MIT, I'm including Richard in this followup for him, first contact for you.
Following the resignation of the Boeing president because of emails showing he was having an affair with an employee, a student asked me, "Who can read my email?" I'd generalize that to who can read faculty, staff, or student email. It's the policy question in which we are interested, not the reassurances about how well your staff are trained (though god knows that is probably the most important thing as a practical matter!).
Here is what I can figure out. If there are pieces I am missing I'd appreciate knowing that.
I wrote the privacy policy for students, which was, I think, voted by FAS before it was put in the Student Handbook. HASCS has put that language in its own policies in a place that suggests they apply to all FAS systems and users. I wonder if that broadening was done by you or by you and some FAS dean or committee. Anyway it's a good thing.
At the university level there is a personnel manual item that says you have no privacy at all - Harvard can read your email for any "business" purpose. I am uncertain whether this personnel manual applies to faculty or just staff. In any case it is pretty amazing to read - I am sure it conforms to the law but it is certainly very far from community expectations today. (And even more amazing that you have to go through PIN authentication even to read that you have no privacy!) And even if it does not apply to faculty but does apply to staff, would university policy trump FAS policy for a staff member if there were a request from the Center?
MIT has a policy which looks broader, simpler and more reassuring, though I am not sure it really is since the ball gets kicked to a VP to decide. I have an inquiry into that VP to find out what he thinks that means.
I am attaching a collection of snippets of these various policies. If I am missing any, do let me know. I really wonder if it makes any sense to set these policies separately in 10 Faculties given the way computers are networked, with a potential of 30 different policies (for faculty, staff, and students) or more (if the Center has its own view, which it may well regard as trumping any policies adopted within a Faculty).
By the way, my question has nothing to do with the (to me) obvious fact that the university will comply with any subpoena or court order. The question is just what the university can do for its own purposes by way of reading email of users of its systems. I should also say that some of these policies made more sense once than they now do. The staff policy in the personnel manual doubtless reflected (a) email storage as a scarce resource and (b) OGC's insistence on giving no more assurance of privacy than the law would require, which was not very much on business machines.

This email, which sounds hauntingly familiar in places (really? why not have 10 policies for 10 Faculties?), started a thread of discussion in the FAS I/T committee and elsewhere. I can't really say what happened next, but the minutes of the June 2, 2006 meeting of the FAS I/T committee include this item:
5. E-mail privacy policy

Dan Moriarty reported that, after extended consultations and negotiations, a new policy regarding the privacy of faculty e-mail has been written and approved. This policy, which is attached (email_privacy.txt), for the first time affirms that e-mail messages and electronic documents on Harvard computers are considered confidential and will be accessed only in specific and extraordinary circumstances. Note, however, that the new policy applies only to faculty, not to staff or students, for whom existing policies remain in place. Members expressed gratitude to Dan and his colleagues for their work in providing greater protection of electronic materials developed and stored by faculty.

The minutes don't say who the consultations and negotiations had been with, but intervening exchanges indicate that the Office of General Counsel was, reluctantly perhaps, involved in the discussion. The attachment included the text of the Faculty policy referred to in Footnote 3:
New Policy Regarding the Privacy of Faculty Electronic Materials
The Faculty of Arts and Sciences provides the members of its faculty with computers, access to a computer network and computing services for business purposes, and it is expected that these resources will be used in an appropriate and professional manner. FAS considers faculty email messages and other electronic documents stored on Harvard-owned computers to be confidential, and will not access them, except in the following circumstances.
First, IT staff may need access to faculty electronic records in order to ensure proper functioning of our computer infrastructure. In performing these services, IT staff are required to handle private information in a professional and appropriate manner, in accordance with the Harvard Personnel Manual for Administrative and Professional Staff. The failure to do so constitutes grounds for disciplinary action.
Second, in extraordinary circumstances such as legal proceedings and internal Harvard investigations, faculty records may be accessed and copied by the administration. Such review requires the approval of the Dean of the FAS and the Office of the General Counsel. The faculty member is entitled to prior written notice that his or her records will be reviewed, unless circumstances make prior notification impossible, in which case the faculty member will be notified at the earliest possible opportunity.

The policy has disappeared (been disappeared, I am tempted to say) from where it used to appear on the Harvard web site:
by Harry Lewis (noreply@blogger.com) at July 23, 2013 07:59 PM
This blog has published three pieces [first, second, third] about the Snowden effect, discussing the whats, whys, and hows of networked digital espionage in the US, and of US espionage on the rest of the world, along with its implications. It is not hard to suppose that governments of every stripe, friends of the Americans or not, have asked the question that was asked here in Brazil: literally, how much would it cost to move Brazilian government data out of systems the NSA can access without our even knowing? The third piece in the series was about exactly that, and it argued that the problem is complex, large, and [very] long-term. And expensive, of course.
Last week, Microsoft and Google published their performance numbers for the second quarter of 2013, and they give an idea of the size of the problem. Microsoft's CEO, Steve Ballmer, said the company runs more than one million servers. That means many gigantic datacenters, one of which draws 100 MW, equivalent to the consumption of a Brazilian city of 50,000 people, to provide personal and corporate services to the whole network, worldwide. And the vast majority of those datacenters are in the US, for a great number of reasons, including that the cost of having them [for example] in Brazil, with similar performance, would be unviable, even if the [basic] network infrastructure existed, which is not necessarily the case in almost any country. And it is not just about data and its storage, but traffic: 25% of all internet transactions in North America already pass through Google, versus 6% three years ago. Spy on Google and you cover a quarter of everything that happens on the network in the northern half of the continent…
The idea of keeping Brazilians' data in Brazil, which some have floated, is out of the question. The government has no power to tell me where I may [or may not] store my data. To do that, it would have to do what Iran has been attempting for years: take the country off the global internet and leave all local users on a closed national network, isolated from the worldwide internet for almost all purposes. Requiring service providers like Google to store data in Brazil, as the president suggested in a statement, will only happen in practice if [as in Iran…] Brazil decides to close its network. The alternative would be to forbid Facebook, Microsoft, and Twitter [and all the others] from serving Brazilians until they had datacenters in Brazil. And even then… how would such a condition be verified?… Amending the Marco Civil to state explicitly that Brazilians' data cannot be spied on is the same as decreeing, in the constitution, that satellites cannot photograph Brazil from space. It will be law… but it will not hold.
The communications minister thinks otherwise and believes that, because Google earns a lot in Brazil [R$2.5 billion, out of more than R$100 billion in revenue per year…], the company should have every interest in the world in installing datacenters in Brazil. But that is not the point, minister: Google datacenters here would be just as exposed as Microsoft's in the US, or more so. The problem lies elsewhere: after decades of neglect, the challenge of building national business capabilities on the network, capabilities that play by our rules rather than by those of any other country, is squarely in front of us. Rewriting the Marco Civil to deal with this, or creating a market reserve for national web services, will accomplish little. In fact, that is not even the problem: when exchanging a message with someone on Gmail in the US, for example, a Brazilian is as exposed to American surveillance as if all his data were stored there, even if there were digital fences around all of his data here. The hole, as Neném Prancha would say, is much deeper down: it is in the architecture of the network, and it cannot be fixed with speeches… or with national laws. Or bravado.
The problem the executive branch can address, and address well, is keeping government data under state control. In a country as large as ours, that will be expensive, especially if world-class performance is required. The quality of Google's services results from a great many factors, including spending on infrastructure. The chart below shows how much Google spends on servers and related items per quarter; this year alone, US$2.8 billion has already gone out the door.
Below, a comparison of how many servers Google, Microsoft, and Facebook run and how much they have spent over the last two years. It has long been known that the information society is being built on top of an information infrastructure. Consistent with its very low investment in infrastructure, Brazil as a state has not invested either, nor has it created the conditions for private enterprise to make the investments needed to build world-class national information providers. To be fully consistent with the past, all that remains now is to create a market reserve for web services, a digital-ostrich policy, in the wake of the last quarter century of many strategic errors in information and communication technologies, but with even less chance, or none, of working.
The way out is to think and plan now in order to execute over the long term, and to prepare Brazil to compete globally in the digital world. Or are we going to end up trapped, literally, in a social network provided by the federal government?… One that would also, by the way, commission from SERPRO a games network like Steam?… The behavior of a good share of Brazilians who game on the web is right there. Yes, and most of them pay for it using PayPal… which here would be developed by the Receita Federal?
I don't know. Something tells me Brasília urgently needs to come to the real world. Even if only for the virtual part of it…
This week’s picks for stations: STEM Story Project pieces of all stripes, Ramadan, and weekly music series.
Want to get weekly station newsletters via email? Subscribe.
Southeast Asia
Tougher Internet filtering policies are being applied throughout Southeast Asia. Singapore’s government initiated new rules requiring online news websites to apply for individual licenses and put up a $50,000 bond. The move met with a strong response: 150 websites blacked out their homepages in protest in May, and 2,000 demonstrators took to the streets. Vietnam has been putting activists and dissidents in jail on specious charges; the country has detained forty-six bloggers and democracy activists so far this year, more than during the whole of 2012, amid erupting strikes and social unrest stirred by inflation, land-rights abuses, and corruption. Thailand has also clamped down, strengthening Internet censorship: 20,978 URLs were blocked last year, compared to just 5,078 in 2011.
Gambia
The Gambia House of Representatives has enacted a new law banning online criticism of, and derogatory content about, government officials. The Information and Communication Bill 2013 puts stringent punishments in place for those who violate the law: up to 15 years in prison, a fine of up to three million Dalasi (about 100,000 US dollars), or both. The law targets any person found to be spreading false news or derogatory statements against the government or any public officials. The bill seeks to impose deterrent punishment on people engaged in campaigns against the government, both inside and outside the country, according to Nana Grey-Johnson, the Gambia’s Minister of Information, Communication and Information Infrastructure. Human rights groups say the new law takes the restriction of freedom of expression in the Gambia to “a shocking new level”.
Russia
Russia has been pushing new legislation that allows copyright holders to ask courts to block access to allegedly pirated content, as well as to hyperlinks to such content. The anti-piracy law has stirred much controversy because it could cause Wikipedia to be blocked in the country: Wikipedia contains millions of hyperlinks to content that may or may not be authorized. If the legislation comes into force on August 1, Russian Internet users may be denied access to Wikipedia entirely. Wikipedia blacked out its Russian-language website in protest of the proposed law.
#imweekly is a regular round-up of news about Internet content controls and activity around the world. To subscribe via RSS, click here.
As California delays public access to prank celebrity 911 phone call records, a court in Maine has kicked things up a notch, pulling from one of over 500 exceptions to Maine's Freedom of Access Act (“FOAA”) to block public access to a 911 record in connection with an ongoing criminal trial.
In California, access to 911 call records has caused a stir among news organizations. The public's enduring fascination with celebrity 911 calls is nothing new, but a few pranksters in California have taken things to another level. California police have recently been having a lot of trouble with "swatting" -- an inside joke among pranksters, who phone in fake emergencies and bring police rushing en masse, sometimes with SWAT team divisions, to celebrity homes. During one swatting incident in January of this year, pranksters sent over half of Beverly Hills' emergency resources rushing to Tom Cruise's house, only to discover that nothing was amiss at his mansion.
The Los Angeles police's response to swatting seems to highlight law enforcement's readiness and willingness to rush to a celebrity emergency. This has caused some to wonder whether the legal system, beginning with 911 emergency calls, treats celebrities differently than the general public. This concern over differing treatment has prompted some to call for complete transparency of celebrity 911 records, regardless of privacy concerns for the individual celebrities.
However, the Los Angeles Police Department's embarrassment at being the victim of frequent swatting pranks has driven the department to refuse to release information to the media about 911 swatting calls without a California Public Records Act request, or even to confirm media inquiries about swatting. The LAPD cites a need to reduce the motivation pranksters receive from the publicity surrounding their pranks. Instead, the media will have to file a public records act request to receive information about swatting incidents, which can take up to 10 days. Some members of the press are up in arms about this restriction, but there is no right to a faster response under California law.
Luckily, California drew back from the brink of limiting access to 911 calls more sharply last year, when an effort to create a new Section 6256 of the California Public Records Act died in committee. This proposal would have placed restrictions on the people to whom public agencies could disclose recordings of 911 calls, limiting disclosure to a select group of requesters: a court, law enforcement, attorneys involved in the proceeding related to the call, or the caller who made the 911 call initially. Missing from the list? The public and the press.
Maine has had a more troubling response to public access to 911 records. In MaineToday Media v. State of Maine, a trial court judge allowed the state to withhold phone records of a 911 call placed by Derrick Thompson, who, prior to being shot by his landlord, had called 911 concerning his landlord's behavior. The court cited the law enforcement investigation exception to Maine's FOAA, which arises under Maine's Criminal History Record Information Act, 16 M.R.S.A. § 614, and permits records containing investigative information to be kept confidential if there is a "reasonable possibility" that public release of the records would "interfere with law enforcement proceedings." Although the court's ruling appears to be redacted in part, the Reporters Committee for Freedom of the Press reports that the trial court allowed withholding of the records under the investigation exception because the information from the records might affect witness testimony at trial.
The plaintiff, a heavyweight in the Maine publishing world that owns the state's largest newspaper, Portland Press Herald, has appealed the decision. MaineToday Media is arguing on appeal that the state failed to show that the information would harm witness testimony. In addition, a number of organizations, including the Reporters Committee for Freedom of the Press, have filed amicus briefs in support of MaineToday Media's appeal. The organizations are challenging the breadth of the judge's ruling, claiming that the decision "runs counter to the open government intent" of FOAA.
FOAA, Maine's Right to Know Law, was enacted in 1959. According to the state government's website, the act promotes the "fundamental principals" of "transparency and open decision-making." Although the original version granted expansive access to government records, in 1975 Maine began to curtail the public's access. Since the 1970s, Maine has narrowly construed what records are available and has created more and more exemptions each year; in March of this year, a number of bills were proposed that would further limit public access to government files. With over 500 exceptions to FOAA (a list that looks like it won't stop growing anytime soon), some may well wonder if Maine has been derailed from its transparency track.
If Maine's Governor is any indicator of the state government's attitude about access, the FOAA frontier looks bleak indeed. Governor Paul LePage has received a level of notoriety in association with his treatment of the press's requests for information. LePage, a three-time Muzzle Award winner, has gone as far as ordering government employees not to talk to specific newspapers, like the Portland Press Herald. In an interview with MSNBC, Cliff Schectman, Executive Editor of the Herald, discussed the frequent challenges the Herald faces when it attempts to obtain information from the state government or obtain press passes to government events. Schectman emphasized that the state government’s concerted effort to avoid transparency will affect the public in a variety of fields.
The concept of open access to the government's inner workings harks back to a 17th-century ideal of an accessible government by the people, for the people. Although California seems merely to have postponed the fight over restricting public access to 911 records, Maine seems to have strayed from the notion of an open government.
Samantha Scheller is an intern at the Digital Media Law Project and a rising 2L at the University of North Carolina School of Law.
(Image courtesy of Flickr user eflon pursuant to a Creative Commons CC BY 2.0 license.)
Beth Holland and I have a piece up at KQED MindShift this week as part of our Someday/Monday series about teaching and learning with iPads. The Someday/Monday duality captures a tension in our work as teacher educators: we want teachers to stretch themselves and imagine ways that technology can help them make powerful changes in their classroom practice, but at the same time we don't want to overwhelm them. We also want to provide advice that they can put to work right away on Monday. This piece is the third in a four-part series on teaching with iPads, moving from consumption to curation and now to creation.
Beth and I argue that the iPad shouldn't be a replacement for textbooks and notebooks, but a mobile, flexible multimedia creation device that students use to craft performances of understanding. Here's a snippet:
In the best iPad classrooms, students are constantly making things.
A big part of what they are doing is documenting their learning. At the EdTechTeacher iPad Summit in Atlanta, Jennie Magiera showed a video of a math student working through a problem on a screencasting app, talking aloud, showing and recording his work. I observed a biochemistry lab class at Deerfield Academy where students had iPads, and they used them throughout class to take pictures and video recordings of the lab experiments, which later became key parts of their reports and presentations. In helping students learn to make inferences from poetry, Kristin Ziemke has her first graders draw their mental images from poems that she reads. When I visited the Hillbrook School in northern California, I tried to visit a history class, but I was a few minutes too late. Just after the period started, students in period costumes dispersed across the campus, recording short reenactments.
These rich examples of documentation evoke ideas from Project Zero's Making Learning Visible and Visible Thinking programs. When students and teachers take the time to document their learning and create tangible performances, when they create objects-to-think-with, they deepen their understanding of material and, perhaps more importantly, create tools to spark metacognition: thinking about their own thinking. Tablets have shortcomings in creating certain kinds of learning objects (the iPad in particular is a very weak platform for learning coding and programming), but with the combination of camera, microphone, touch interface, and large viewing surface, tablets are terrific tools for creating a running record of student learning and activity.
You can read the rest of the piece over at KQED MindShift.
For regular updates, follow me on Twitter at @bjfr and for my publications, C.V., and online portfolio, visit EdTechResearcher.
#HGSEPZFOL
- Justin Reich

HBR.com has just put up a post of mine about some new guidelines for "paid content." The guidelines come from the PR and marketing communications company Edelman, which creates and places paid content for its clients. (Please read the disclosure that takes up all of paragraph 4 of my post. Short version: Edelman paid for a day of consulting on the guidelines. And, no, that didn't include my agreeing to write about the guidelines.)
I just read the current issue of Wired (Aug.) and was struck by a particularly good example. This issue has a two-page spread on pp. 34-35 featuring an infographic that is stylistically indistinguishable from another infographic on p. 55. The fact that the two-pager is paid content is flagged only by a small Shell logo in the upper left and the words "Wired promotion" in gray text half the height of the "article's" subhead. It's just not enough.
Worse, once you figure out that it’s an ad, you start to react to legitimate articles with suspicion. Is the article on the very next page (p. 36) titled “Nerf aims for girls but hits boys too” also paid content? How about the interview with the stars of the new comedy “The World’s End”? And then there’s the article on p. 46 that seems to be nothing but a plug for coins from Kitco. The only reason to think it’s not an ad in disguise is that it mentions a second coin company, Metallium. That’s pretty subtle metadata. Even so, it crossed my mind that maybe the two companies pitched in to pay for the article.
That’s exactly the sort of thought a journal doesn’t want crossing its readers’ minds. The failure to plainly distinguish paid content from unpaid content can subvert the reader’s trust. While I understand the perilous straits of many publications, if they’re going to accept paid content (and that seems like a done deal), then this month’s Wired gives a good illustration of why it’s in their own interest to mark their paid content clearly, using a standardized set of terms, just as the Edelman guidelines suggest.
(And, yes, I am aware of the irony – at best – that my taking money from Edelman raises just the sort of trust issues that I’m decrying in poorly-marked paid content.)
Few books have shaped my vision of what students need to learn more than The New Division of Labor by Richard Murnane and Frank Levy. Published in 2004, the book details the ways in which computers are transforming labor markets. They now have a new update to that book, a white paper titled Dancing with Robots: Human Skills for Computerized Work.
Levy and Murnane argue that computers do a few things very well, and they do those things very cheaply. In particular, they are extremely good at tasks that can be organized into a set of rules-based routines. Assign luggage tags to the airline passenger. Put the bolt here. Follow the rules of the tax code. Search for every instance of the word "kickback" in 2 million documents. If something can be broken down into a series of if-then-do statements, then sooner rather than later a computer or robot will be doing that task.
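To make the rules-based idea concrete, here is a minimal sketch in Python (my illustration, not Levy and Murnane's; the function name, file layout, and keyword are hypothetical) of their document-search example reduced to a handful of if-then-do rules:

from pathlib import Path

# A hypothetical illustration of a rules-based routine: every step is an
# explicit if-then-do rule, so the whole task can be handed to a machine.
def flag_documents(folder: str, keyword: str = "kickback") -> list[str]:
    """Return the paths of all .txt files in `folder` that contain `keyword`."""
    flagged = []
    for doc in Path(folder).glob("*.txt"):        # rule: examine every document
        text = doc.read_text(errors="ignore").lower()
        if keyword in text:                       # if the keyword appears...
            flagged.append(str(doc))              # ...then flag the document
    return flagged

The point is not this particular code but its shape: each step is an explicit rule, so a machine can apply it identically to two documents or to two million, far faster and more cheaply than a person could.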
Computers, however, are still not very good at certain kinds of tasks, and Levy and Murnane put these into three big categories: solving unstructured problems, working with new information, and carrying out non-routine manual tasks. Unstructured problems are those where the desired end point, or the set of information needed to solve the problem, is unknowable in advance. Broadly speaking, we think of these problems as requiring creativity to solve. Working with new information is another way of talking about communication, at which computers are still not very good. Humans remain better at solving challenges that require social interaction to define the problem space or solicit necessary information. And robots remain laughably bad at some fairly basic manual tasks; for instance, it can still take a robot up to 20 minutes to fold a single towel from a laundry pile.

These observations lie at the core of the idea of 21st century skills. It's not that unstructured problem solving or working with new information are new skills for the 21st century; it's that they are newly important in the 21st century as computers replace routine-based work. In economic terms, humans have a comparative advantage over computers in these domains. In the past three decades, jobs requiring routine manual or routine cognitive skills have disappeared from the labor market, while jobs requiring unstructured problem solving, communication, and non-routine manual work have grown as a proportion of the labor market. The best chance of preparing young people for decent-paying jobs in the decades ahead is equipping them with the skills to solve these kinds of complex tasks.
For regular updates, follow me on Twitter at @bjfr and for my publications, C.V., and online portfolio, visit EdTechResearcher.
FOL2013
We’re very excited to announce a new partnership between PRX and FRONTLINE. We’ll be working on an iPad app to feature FRONTLINE’s award-winning documentaries. Here’s an excerpt, but head over to the PRX Apps Blog to read more about the project.

It will take advantage of the tablet’s unique form and features, and the unique ways in which people tend to use tablets — in particular, tablet users spend more time with longer-form media. It will keep FRONTLINE’s powerful work front and center, while also providing deeper multimedia info and follow-up to ongoing events. This app will further FRONTLINE’s and PRX’s reputations for bringing broadcast and digital together in innovative and effective ways.