I unbroke my Ubuntu

I have a confession to make: IANAP (I am not a programmer), so when my technology breaks I struggle to fix it with a mixture of duct tape and Google! Well, OK, so no tape. Despite my lack of competence I have made several forays into the wonderful world of Linux, lulled by a mix of political correctness, can-do spirit and a philosophy I believe in. But none of this helps when technology fails. No amount of feel-good philosophy can help me read my email, which is the real reason I have technology.

So last week, when I was practicing with my new toy, a Samsung notebook running Ubuntu, I hit a wall of desperation when the menus disappeared. When I turned it on, all I got was a background. Since I had not made any changes to the defaults it was still the boring brown background – and nothing else.

So I guessed, pushed and prodded the computer, but it stubbornly refused to divulge any clues as to how it could be fixed. But never fear, the internet is here! The wonderful post How to Reset Ubuntu/Gnome Settings to Defaults without Re-installing fixed everything.

Open a terminal and, from your home directory, type

mv .config .old_config

hit return, then type

rm -rf .gnome .gnome2 .gconf .gconfd .metacity

I restarted the computer and it was as good as new. All the menus are back and I am a happy Ubuntu person, ready to go out and re-break my computer – frustration is, after all, a learning experience.
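For the more cautious, the same reset can be done without permanently deleting anything. Here is a minimal sketch (the function name and the backup-folder naming are my own invention, not from the original post) that moves the settings directories into a dated backup folder instead of rm -rf'ing them, so the old settings can be restored if something goes wrong:

```shell
#!/bin/sh
# Sketch: reset GNOME settings by moving the config directories aside
# rather than deleting them. Takes the home directory to operate on as $1.
reset_gnome_settings() {
    home="$1"
    backup="$home/settings-backup-$(date +%Y%m%d)"
    mkdir -p "$backup"
    for dir in .config .gnome .gnome2 .gconf .gconfd .metacity; do
        # Only move directories that actually exist
        [ -e "$home/$dir" ] && mv "$home/$dir" "$backup/"
    done
    echo "$backup"
}

# Demonstrate on a throwaway directory rather than the real home:
demo=$(mktemp -d)
mkdir -p "$demo/.config" "$demo/.gconf"
backup=$(reset_gnome_settings "$demo")
ls -A "$backup"   # .config and .gconf are now tucked away in the backup
```

On a real machine you would call it as reset_gnome_settings "$HOME" and then restart; if the defaults turn out worse than what you had, the old directories can simply be moved back.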

I'm a Gikii

The Gikii 4 conference is coming up soon; it will be held in Amsterdam on 18–19 September – this year it is organized by the Institute for Information Law (IViR). I am particularly happy since I will be attending with a paper of my own.

The program for the conference is here. Just to give you an idea of the type of stuff on offer, here are a couple of the papers being presented (full list here).

Luddism 2.0, or How I Learned to Stop Worrying and Love the Web

ZombAIs and family law: technology beyond the grave

“Get out of my head, bloodsucker!” Notions of surveillance in the vampire mind

EAT ME

Robot Law?

Future Tech: Governance & Ethics In The Age Of Artificially Enhanced Man (Or ‘Beware The Zombais At The Gate’)

As you can see from this short list, Gikii is definitely on the bizarre side of technology law.

Should photography lectures be censored?

Simon Burgess teaches photography at East Surrey College. During the Higher National Diploma course in Digital Photography he displayed photographs by the controversial photographer Del LaGrace Volcano. Apparently one or more of the second-year students were less than impressed and complained to the college. (British Journal of Photography)

Burgess has been called to a hearing to defend his actions and in the worst case he may be fired. The college told the British Journal of Photography: “Until the facts are raised in a hearing, we cannot comment about staff-related actions.”

It is good that the college wants to know the facts before discussing the problem with the media. BUT. The readiness of students to complain about course content is becoming strange. Should the lecturer teach what is important for students to learn, or should the lecturer limit him- or herself to teaching that which does not offend? The latter would, or could, narrow what we are able to teach to a very small set of subjects.

Del LaGrace Volcano may be controversial (see quote below) but this cannot in itself be a reason for complaint. It is a dangerous precedent when lecturers are asked to limit themselves to that which is acceptable – for the question is: acceptable to whom? The students are there to be educated, so in theory they should be less knowledgeable. Maybe they need their minds expanded?

As a gender variant visual artist I access ‘technologies of gender’ in order to amplify rather than erase the hermaphroditic traces of my body. I name myself. A gender abolitionist. A part time gender terrorist. An intentional mutation and intersex by design, (as opposed to diagnosis), in order to distinguish my journey from the thousands of intersex individuals who have had their ‘ambiguous’ bodies mutilated and disfigured in a misguided attempt at ‘normalization’. I believe in crossing the line as many times as it takes to build a bridge we can all walk across.

September 2005

Support for Burgess is growing. Dr Eugenie Shinkle, a senior lecturer in photographic theory and criticism at the University of Westminster’s school of media, arts and design, writes (The Sauce):

Management are claiming it is pornography, salacious, grotesque, worthless and not relevant to, or appropriate for 2nd year level three photography students preparing for higher study. Apart from being censorious, backward, and homophobic, management’s stance displays a remarkable ignorance of contemporary debates and image-making strategies. This is a serious matter that has implications for all academics, teachers, and students.

I really hope that the college has the backbone to realize what it is there for and to support its lecturer.

Code Rush

The documentary Code Rush from 2000 is about the open-sourcing of the Netscape code base and the beginning of the Mozilla project. Here is a comment from IMDb:

Watch this film and you will get to see the things that a college computer science course could never prepare you for: having to sleep at the office for days in order to meet a deadline, alienation from family, caffeine addiction, having one’s release blocked by intellectual property concerns, and other cold realities of Silicon Valley. If you’re thinking about getting a career in software engineering or software project management, Code Rush is a must-see.

This documentary also gives insight into a few of the major milestones in the history of the software industry, such as the opening of the Netscape source code, which is code named “Mozilla”. If it weren’t for this release, we wouldn’t have Mozilla Firefox, one of the most popular Internet browsing solutions today. The footage also covers one of the most notable company acquisitions of that time period.

Code Rush is now released under the Creative Commons Attribution-Noncommercial-Share Alike 3.0 license. There is also a dedicated homepage for the film, with links to stream or download the film in various formats.

Twitter under attack

Twitter is struggling to overcome a denial-of-service attack. They wrote this five hours ago:

We are defending against a denial-of-service attack, and will update status again shortly.

Update: the site is back up, but we are continuing to defend against and recover from this attack.

Update (9:46a): As we recover, users will experience some longer load times and slowness. This includes timeouts to API clients. We’re working to get back to 100% as quickly as we can.

Locational Privacy

The EFF have released a new report on the privacy dangers of locational information: On Locational Privacy, and How to Avoid Losing It Forever (PDF). The short report by Andrew J. Blumberg and Peter Eckersley takes up issues that need to be taken much more seriously than they have been.

Humpty Dumpty and irreversible systems

While reading a bit of retro work I came across this:

A little known law of life is that of irreversibility. No human or physical act or process can be reversed so that objects and states end up as they were. During the original act and in the time just after it, both object and state undergo change that is irreversible. An early known poem, Humpty-dumpty, recognises this. Once the egg is broken, that is that.

It is the same with systems. They can never be reversed. They can be changed, certainly, and sidetracked, and they can be very easily destroyed, the moment a human-machine information system comes into being, it takes on a life of its own independent of its creators. The operators just run it, while programmers merely maintain it. The process called entropy begins, a confusion that can be measured by the growing gulf between what people first knew about the system and now know about it.

Brian Rothery (1971), The Myth of the Computer, Business Books, p 43.

The end of free

Rupert Murdoch’s media empire News Corp has reported a huge financial loss ($3.4bn). Naturally this cannot go uncommented, so in today’s Guardian Murdoch is quoted as saying that quality journalism* is not cheap and that the era of a free-for-all in online news is over.

So what to do? Well Murdoch’s response is to start charging for online news:

“The digital revolution has opened many new and inexpensive distribution channels but it has not made content free. We intend to charge for all our news websites.”

There may have been a time in history when newspapers could have gone the way of pay-per-view, but today free has spread. One of the reasons for the increasing losses in the print industry is not the traditional web but rather the growth of user-produced content (web 2.0). Even if many of these user-producers leech off print media (as does this article, since it is a reaction to what I read in the Guardian), it would be very difficult to lock down the news.

The news (whatever that term means) is spread across a number of different sources: official, unofficial, personal, impersonal, gossip, fact, free, costly and so on. But few news sources are so powerful that they can be enclosed and charged for once they have been provided for free. A pre-internet truth has always been: any news source can be adequately replaced by other news sources. The internet aggravates this by providing a seemingly infinite number of news sources.

Even though the newspaper business is struggling with its adaptation to new technology, charging readers to read material online will fail. Any attempt by a newspaper to end free will only result in the end of that newspaper. For better or worse – free is here to stay.

* Cannot resist reminding people that “quality journalism” provided by News Corp includes trashy tabloids like The Sun and News of the World as well as quality like The Times and Wall Street Journal.

The future of Wikipedia

It’s almost hard to think of a time when Wikipedia was not the source of all knowledge. After having survived the quality wars (is a wiki as good as a printed encyclopedia?), amazing growth, lawsuits & legal threats, internal squabbles and international expansion – today Wikipedia seems as permanent and natural as summer holidays.

But Ed Chi at the Palo Alto Research Center, interviewed by New Scientist, sees trouble ahead:

The number of articles added per month flattened out at 60,000 in 2006 and has since declined by around a third. They also found that the number of edits made every month and the number of active editors both stopped growing the following year, flattening out at around 5.5 million and 750,000 respectively. (read more on this here)

The Wikimedia Foundation has begun a strategic review of Wikipedia to better understand why these changes are occurring. Chi argues that a main part of the problem is the growing number of Wikipedia “experts” – in other words, people who are experts on Wikipedia itself. They become a problem since experts on a specific topic are unable to compete with the wiki-experts’ time and expertise in an eventual debate. I have commented earlier on the inclusionists v deletionists issue.

Chi thinks that Wikipedia now includes so much information that some editors have turned from creating new articles to improving existing ones, resulting in more disputes about edits. Such disputes are not a level playing field because established editors sometimes draw on extensive knowledge of Wikipedia’s guidelines to overwhelm opposition in a practice dubbed “wikilawyering”.

In part, some of the more devoted editors of Wikipedia (the wikipedia experts) are becoming more fascinated with Wikipedia itself than with its content. The whole point of Wikipedia should be its ability to easily provide information (preferably expert information), but as many discussion pages show – content is not king. Wikilawyering is definitely discouraging participation by experts.