Random thoughts
Thursday, July 7, 2011
Kim Wilde, Tisabamokkha and tears
Destination: Phnom Penh
Humming Kim Wilde’s “Cambodia”, I am getting ready for a special trip. In a few hours, I will leave for my assignment in Phnom Penh under the umbrella of the IBM Corporate Service Corps (CSC) program. #ibmcsc A team of nine people from around the globe, with diverse professional experience, will come together in the capital of the Kingdom of Cambodia (ព្រះរាជាណាចក្រកម្ពុជា in Khmer) for a month and work with local businesses and NGOs on business, technology and society challenges.
More than 1,000 fellow IBMers have participated in CSC assignments globally. We are the first team to visit Cambodia and hope to lay the groundwork for future engagements. After three months of preparation which flew by faster than you can imagine, we are ready for our journey and looking forward to getting together as a team and meeting our clients and the team from Australian Business Volunteers (ABV), the non-government, not-for-profit international development agency which manages the program locally.
Tisabamokkha
While looking for a fancier name than “Team 1” we came across the folktale of Tisabamokkha, a famous teacher in Takkasila, and of a great king who ruled over a rich kingdom and was looking for ways to protect his kingdom and his people.
The king, the beautiful queen, their four chief ministers, and the royal astrologer learned magic with Tisabamokkha and were taught the art of turning themselves into all kinds of animals and heavenly beings. When they got lost in the forest of Takkasila on their way home and were starving, they decided to use their magic powers to transform their bodies into a royal tiger: The four chief ministers turned into the four legs of the tiger, the astrologer into the tiger's tail and the queen into the tiger's body. The tiger's head was left for the king himself. The tiger was stronger and more powerful than other animals, and he was so happy with the wonderful new life that he never returned to his kingdom.
What we liked about the story is that it emphasizes the idea that people must cooperate for the common good, and remember their responsibilities to give back to the community. Likewise, working on a CSC assignment is also about cooperating and giving back. The CSC program was announced in 2008 by our CEO Sam Palmisano and aims to provide skills, talent, and capabilities to communities in emerging market countries while helping IBMers gain valuable experience and skills for working in a global environment. Participation is completely voluntary, but once you accept the assignment it does require a fairly significant investment in time and resources for preparation and while in country.
Tears
One of the hardest parts is leaving the family behind for a month, even more so during vacation season when the kids are home. The boys took it easy and quickly returned to playing with their toys after kissing me goodbye; it was yours truly whose eyes filled with tears. Thank you to my family for allowing me to explore what previous teams described as one of the best experiences you can have as an IBMer, and thank you to my colleagues and management for the support and encouragement.
PS. We will also post updates about our month in Cambodia on our team blog.
Tuesday, May 17, 2011
Web usability: Account required
David D. from Nikon support (why do people no longer have full names?) sent me a message which started with instructions on how to respond: “If you have any further questions to our support response, please click the link at the bottom. Hitting reply from your email browser will not reach our group.”
Clicking on the link then brings you to—you probably guessed it—the account login page, and there is no way to respond to the message if you don't have a valid, working account.
My account was not restored because I had contacted the wrong Nikon country branch, and rather than forwarding the request to the appropriate support team, David sent me the country list so I could find out how Nikon wants to be contacted.
I have another request into the local support team now, in the same ticketing system, so I really hope that their first response will solve the problem…
Recommendations
- Hide the complexity of your organisation from the customer. Forwarding a request to the appropriate contact within the company is more efficient than having the customer track down the right contact and submit the same information again.
- When sending an e-mail message, be prepared to receive a response by e-mail too. Most online support systems can handle e-mail responses and link the response to the support thread with a unique ID in the subject line.
- Provide an alternate path to address problems with the support website itself. Customers with non-working accounts obviously cannot login to discuss their account problems.
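The second recommendation above relies on a simple mechanism: the support system tags every outgoing message with a ticket ID, and incoming replies are matched back to the thread by that ID. A minimal sketch in Python (the subject-line format here is hypothetical, not Nikon's):

```python
import re

# Hypothetical convention: the ticket ID is embedded in the subject
# in square brackets, e.g. "Re: Your question [Ticket #48213]".
TICKET_PATTERN = re.compile(r"\[Ticket #(\d+)\]")

def ticket_id_from_subject(subject):
    """Return the ticket ID embedded in an e-mail subject, or None."""
    match = TICKET_PATTERN.search(subject)
    return int(match.group(1)) if match else None

# A reply preserves the subject tag, so the response can be linked
# to the existing support thread instead of bouncing.
print(ticket_id_from_subject("Re: Your question [Ticket #48213]"))  # → 48213
```

Because the ID travels in the subject line, it survives any mail client's "Reply" button, which is exactly why most ticketing systems can accept plain e-mail responses.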
Update: The local support team could easily solve what appears to be a general problem with the online support system, one that the first team should have been aware of as well.
Monday, May 16, 2011
Taking a break from teaching
In 2004 I started as an adjunct professor in the computer science faculty, teaching Web scripting to a small group of business and IT students, which meant refreshing and formalizing my JavaScript knowledge (closures, anyone?) and learning the basics of VBScript, too. Following that I taught several rounds of design principles and Web animation, starting with the ancient Macromedia Flash and slowly upgrading to Adobe Flash CS3 and CS4.
Sharing what I know and learning what I didn't know has been a great experience. Not only did teaching give me reasons to explore various subjects in greater detail and develop new skills, I also gained a better understanding of the American educational system and students' expectations at a smaller private university campus, and got to attend a graduation ceremony at the Konzerthaus in formal academic regalia. Discussing with students and seeing them go from no knowledge to expert level in just a few weeks, then come back for the next course and (sometimes) even have fun writing code or building Web animations was also highly rewarding. I am equally pleased to often find some of “my” graduates in good positions at companies and institutions around the world.
Now is a good time to take a break from teaching. Webster University no longer offers the full computer science curriculum in Vienna, resulting in fewer courses which I would like to teach, and balancing this with my other personal and professional activities has become increasingly difficult.
I would like to thank first and foremost my students, you have been a great crowd, and the faculty members and staff at Webster University in Vienna for their support, I have learned a lot from you all.
Thanks, and to my students, good luck with your remaining exams!
Labels: education
Wednesday, April 27, 2011
Bookmarks on the move
AVOS acquires Delicious social bookmarking service from Yahoo!
After months of rumors about Delicious’ future, Yahoo! announced that the popular social bookmarking service has been acquired by Chad Hurley and Steve Chen, the founders of YouTube. Delicious will become part of their new Internet company, AVOS, and will be enhanced to become “even easier and more fun to save, share, and discover” according to AVOS’ FAQ for Delicious.
Delicious became well-known not only for its service but also for the clever domain hack when it was still called del.icio.us (and yes, remembering where to place the dots was hard!) Once called “one of the grandparents of the Web 2.0 movement”, Delicious provides a simple user interface, mass editing capabilities and a complete API, and doesn’t look old in its eighth year of service.
Let’s hope that the smart folks at AVOS will keep Delicious running smoothly.
PS. Current bookmarks should carry over once you agree to AVOS’ terms of use and privacy statement; still, keeping a copy of your bookmarks might be a good idea. To export/download bookmarks, access https://secure.delicious.com/settings/bookmarks/export and save the bookmark file locally, including tags and notes.
Labels: technology, web2.0
Wednesday, February 9, 2011
Google vs. Bing: A technical solution for fair use of clickstream data
Google decided to create a trap for Bing by returning results for about 100 bogus terms, as Amit Singhal, a Google Fellow who oversees the search engine’s ranking algorithm, explains:
To be clear, the synthetic query had no relationship with the inserted result we chose—the query didn’t appear on the webpage, and there were no links to the webpage with that query phrase. In other words, there was absolutely no reason for any search engine to return that webpage for that synthetic query. You can think of the synthetic queries with inserted results as the search engine equivalent of marked bills in a bank.
Running Internet Explorer 8 with the Bing toolbar installed, and the “Suggested Sites” feature of IE8 enabled, Google engineers searched Google for these terms and clicked on the inserted results, and confirmed that a few of these results, including “delhipublicschool40 chdjob”, “hiybbprqag”, “indoswiftjobinproduction”, “jiudgefallon”, “juegosdeben1ogrande”, “mbzrxpgjys” and “ygyuuttuu hjhhiihhhu”, started appearing in Bing a few weeks later:
The experiment showed that Bing uses clickstream data to determine relevant content, a fact that Microsoft’s Harry Shum, Vice President of Bing, confirmed:
We use over 1,000 different signals and features in our ranking algorithm. A small piece of that is clickstream data we get from some of our customers, who opt-in to sharing anonymous data as they navigate the web in order to help us improve the experience for all users.
These clickstream data include Google search results, more specifically the click-throughs from Google search result pages. Bing considers these for its own results and consequently may show pages which otherwise wouldn’t show in the results at all since they don’t contain the search term, or rank results differently. Relying on a single signal made Bing susceptible to spamming, and algorithms would need to be improved to weed suspicious results out, Shum acknowledged.
As an aside, Google had also experienced in the past how relying too heavily on a few signals allowed individuals to influence the ranking of particular pages for search terms such as “miserable failure”; despite improvements to the ranking algorithm we continue to see successful Google bombs. (John Dozier's book about Google bombing nicely explains how to protect yourself from online defamation.)
The experiment failed to validate whether other sources are considered in the clickstream data. Outraged about the findings, Google accused Bing of stealing its data and claimed that “Bing results increasingly look like an incomplete, stale version of Google results—a cheap imitation”.
Whither clickstream data?
Privacy concerns aside—customers installing IE8 and the Bing toolbar, or most other toolbars for that matter, may not fully understand and often not care how their behavior is tracked and shared with vendors—using clickstream data to determine relevant content for search results makes sense. Search engines have long considered click-throughs on their results pages in ranking algorithms, and specialized search engines or site search functions will often expose content that a general purpose search engine crawler hasn’t found yet.
Google also collects loads of clickstream data from the Google toolbar and the popular Google Analytics service, but claims that Google does not consider Google Analytics for page ranking.
Using clickstream data from browsers and toolbars to discover additional pages and seeding the crawler with those pages is different from using the referring information to determine relevant results for search terms. Microsoft Research recently published a paper, Learning Phrase-Based Spelling Error Models from Clickthrough Data, about improving spelling corrections by using click data from “other search engines”. While there is no evidence that the described techniques have been implemented in Bing, “targeting Google deliberately” as Matt Cutts puts it would undoubtedly go beyond fair use of clickstream data.
Google considers the use of clickstream data that contains Google Search URLs plagiarism and doesn't want another search engine to use this data. With Google dominating the search market and handling the vast majority of searches, Bing's inclusion of results from a competitor remains questionable even without targeting, and dropping that signal from the algorithm would be a wise choice.
Should all clickstream data be dropped from the ranking algorithms, or just certain sources? Will the courts decide what constitutes fair use of clickstream data and who “owns” these data, or can we come up with a technical solution?
Robots Exclusion Protocol to the rescue
The Robots Exclusion Protocol provides an effective and scalable mechanism for selecting appropriate sources for resource discovery and ranking. Clickstream data sources and crawler results have a lot in common: both provide information about pages for inclusion in the search index, and relevance information in the form of inbound links or referring pages, respectively.

| Dimension | Crawler | Clickstream |
|---|---|---|
| Source | Web page | Referring page |
| Target | Link | Followed link |
| Weight | Link count and equity | Click volume |
Following the Robots Exclusion Protocol, search engines only index Web pages which are not blocked in robots.txt, and not marked non-indexable with a robots meta tag. Applying the protocol to clickstream data, search engines should only consider indexable pages in the ranking algorithms, and limit the use of clickstream data to resource discovery when the referring page cannot be indexed.
Search engines will still be able to use clickstream data from sites which allow access to local search results, for example the site search on amazon.com, whereas Google search results are marked as non-indexable in http://www.google.com/robots.txt and therefore excluded.
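As a minimal sketch of the proposed rule, using Python's standard-library robots.txt parser (the user-agent name, hostnames and robots.txt content here are illustrative): a clickstream referrer contributes to ranking only if a crawler would be allowed to fetch the referring page.

```python
from urllib.robotparser import RobotFileParser

def ranking_signal_allowed(robots_txt, referring_url, user_agent="ExampleBot"):
    """Apply the Robots Exclusion Protocol to a clickstream referrer:
    use it as a ranking signal only if the page is crawlable.
    Non-crawlable referrers would be limited to resource discovery."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, referring_url)

# Google marks its result pages as off-limits in robots.txt, so a
# click-through from a /search URL would be excluded from ranking:
robots = "User-agent: *\nDisallow: /search"
print(ranking_signal_allowed(robots, "http://www.google.com/search?q=foo"))  # → False
```

A site that allows its local search results to be indexed, on the other hand, would pass this check, which matches the amazon.com example above.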
Clear disclosure how clickstream data are used and a choice to opt-in or opt-out put Web users in control of their clickstream data. Applying the Robots Exclusion Protocol to clickstream data will further allow Web site owners to control third party use of their URL information.
Labels: bing, google, microsoft, seo, technology, webdevelopment
Monday, January 24, 2011
IBM turns 100
The IBM Centennial Film: 100×100 shows IBM's history of innovation, featuring one hundred people, each presenting an IBM achievement recorded in the year they were born, and bridges into the future with new challenges to build a smarter planet.
Another 30-minute video tells the story behind IBM inventions and innovations.
For more than twenty years I have not just worked for IBM but been a part of IBM. It has been a pleasure, and I certainly look forward to many more to come!
I am an IBMer.
Labels: business, ibm, innovation, technology
Saturday, January 1, 2011
Happy New Year 2011
Friday, December 31, 2010
Missed a birthday this week? Blame Facebook!
A chronologically sorted list of upcoming birthdays comes handy, and Facebook usually provides that. In the last week of the year, however, the sorting doesn't look quite right.
Dear Facebook, January does come before December but only in the same year:
At least you have a good excuse now if you missed a birthday this week.
PS. Have you noticed that Facebook informs you about your friends' birthdays even when they don't share that information on their profiles?
Friday, December 24, 2010
When two men fight for their position in line…
When a truck and a car block each other at the car park and both drivers furiously refuse to back up,
When all the Briochekipferl are sold out as if everyone was going to have a Verhülltes Bauernmädchen for dessert this year,
Then it must be the most peaceful time of the year.
We wish you all the best for the holidays.
Frohe Weihnachten!
Merry Christmas!
Veselé Vianoce!
Joyeux Noël!
Feliz Natal!
کریسمس مبارک
圣诞节快乐!
Labels: personal
Sunday, November 7, 2010
How to fix the “Your computer is not connected to the network” error with Yahoo! Messenger
If you are like me and upgrade software only when there are critical security fixes or you badly need a new feature, you may have tried sticking to an older version of Yahoo! Messenger. I made the mistake of upgrading, and was almost cut off from voice service for a few days. Fortunately, Yahoo! has a fix for the problem, which only seems to affect some users.
The new video calling capability in Yahoo! Messenger 10 didn't really draw my attention. Nevertheless I eventually gave in and allowed the automatic upgrade, if only to get rid of the nagging upgrade notice. At first everything seemed fine: The settings were copied over, and the user interface looked reasonably familiar. However, soon after, voice calls started failing with an obscure error message “Your computer is not connected to the network”. Restarting Yahoo! Messenger sometimes helped, but clearly this wasn't working as reliably as the previous version. “Works sometimes” wasn't good enough for me.
Yahoo! support was exceptionally helpful: within minutes the helpdesk agent had identified that I was running Yahoo! Messenger version 10.0.0.1270-us, which was the latest and greatest at the time but had a few issues with voice telephony. He recommended uninstalling the current messenger, manually installing a slightly back-level version of Yahoo! Messenger 10, and disallowing further automatic upgrades. The installation worked smoothly, and voice support in Yahoo! Messenger has been working flawlessly ever since.
Thank you, Yahoo! support.
Labels: networking, technology, windows
Wednesday, September 22, 2010
One year later …
One year later I have answers to most of my questions. Getting back into the learning routine wasn’t too hard. The work commute has provided an excellent opportunity for reading text books (although they look somewhat shabby after a few round trips in the backpack). Watching the video streams in the narrow period during which they are made available turned out to be the biggest challenge, with soon-to-expire lectures piling up towards weekends and more than once requiring the family’s understanding and support.
One year later I am happy to report that things have been going well. I took most of the exams offered and passed with reasonable grades, and will soon have completed the first section of the program.
This week I attend the second lecture block at the Institut für Multimediale Linzer Rechtsstudien in Linz to learn about the courses offered in the upcoming years, to pick up more books and DVDs with recorded lectures, and to meet with other students.
Will this change what I do professionally? That question I haven’t answered yet, but it’s quite possible that one day it will. After I finish my courses, that is.
Monday, August 9, 2010
20 years Internet in Austria
On August 10, 1990, Austria became connected to the Internet with a 64 Kbit/s leased line between Vienna University and CERN in Geneva. Having Internet connectivity at one university didn’t mean everything moved to the Internet immediately.
The first online service I had used was CompuServe in 1985 while visiting friends in the UK. Watching the characters and occasional block graphics slowly trickle in over an acoustic coupler at 300 baud transfer rate was exciting (and expensive for my host). Back home, the post and telecom’s BTX service and Mupid decoders promised a colorful world of online content at “high speed”, relatively speaking, but most of the online conversations still happened on FidoNet, which was the easiest way to get connected. “His Master’s Voice” and “Cuckoo’s Nest” were my favorite nodes. At university our VAX terminals in the labs continued to run on DECNet, as did the university administration system. We learned the ISO/OSI reference model and RPC over Ethernet, but no word of TCP/IP. At work my 3279 terminal eventually gave way to an IBM PS/2 with a 3270 network card, and some foresighted folks in our network group Advantis and in IBM Research started putting gateways in place to link the mostly SNA connected mainframes with the Internet. The BITFTP service Melinda Varian provided at Princeton University opened another window to the Internet world (belatedly, Melinda, thank you!)
Meanwhile Tim Berners-Lee and Robert Cailliau made the first proposal for a system they modestly called the World Wide Web in 1989, and further refined it in 1990.
I don’t recall when I got my first Internet e-mail address and access to the Internet gateways after signing agreements that I wouldn’t distribute commercial information over NSFNet and only use the Internet responsibly, but it was only in 1994 that I took notice of the first Website, the now defunct Trojan Room Coffee Machine at the University of Cambridge, and another year before I had my first homepage and my own domain. As many Websites those days would read, “Welcome to the Internet”.
Happy 20th anniversary to the Internet in Austria!
Related links:
- 20 Jahre Internet in at – Die Revolution fraß ihre Kinder (derstandard.at)
- 20 Jahre Internet in Österreich (Futurezone interview with Austrian Internet pioneer Peter Rastl)
- 20 Years of Internet in Austria/20 Years of ACOnet Infrastructure
Labels: austria, technology
Wednesday, July 28, 2010
July 2010 Vienna JavaScript User Group meeting
First, Matti Paksula from the University of Helsinki gave a mini-talk about SVG and JavaScript. Matti pointed out that canvas was unsuitable for shapes, “it’s for bitmaps, it’s not accessible, and it doesn’t scale”. Canvas isn’t all bad though; a combination of HTML 5, JavaScript, canvas and SVG is needed to replace Flash. (That probably means that Flash will be around for a while, despite the lack of support from some devices starting with an “i”.)
Demonstrations included the canvas-to-SVG conversions and back as shown at SVG Open 2009, and a sneak preview of the latest version which runs completely client-side. Matti also mentioned the PottisJS SVG prototype library and showed an interactive SVG demo.
Next, Roland Schütz talked about JavaScript code management, specifically how to structure code and source files, implement an efficient workflow and automate the building (and testing) of JavaScript code. Roland mentioned a few nice tools for coding and testing JavaScript source code:
- gema general macro processor for pre-processing source files
- JSLint for code quality and consistency checks (for quick tests the online version of JavaScript Lint is quite useful, too)
- phpcpd to detect duplicate code
- Selenium for Web application testing
Finally, Lars Dieckow delivered an impromptu talk entitled “Sommerloch” about–Perl :-). More than fifteen years after the release of Perl 5.000, Perl 6 is just around the corner and the Rakudo Star release will be available from the usual sources starting tomorrow.
As a long time Perl programmer–the first Perl programs I touched were Perl 4 code and I am pretty sure there are some &function calls around still in code we use today–I hadn’t closely followed the development of Perl 6, and it was good to get an update on enhancements and changes in Perl 6 and a live demo of some of the new features after the talk.
Labels: events, javascript, perl, webdevelopment
Thursday, June 24, 2010
Human rights 2.0
Krone focused on freedom of the media, freedom of speech and data privacy in the European Union, pointing out that the Internet itself is not a mass medium but merely a communication channel that carries, amongst other things, media products: Individuals often gather information about others purely to satisfy their curiosity, and conversely share their personal information seeking recognition. Companies mainly satisfy their business needs and sometimes manage to create “sect-like islands on the net like Apple does”, but generally lack the sensibility and awareness for data privacy needs. States need to balance the need for security and state intervention with the freedom of the people and basic rights.
In the following discussion, Krone suggested the Internet would eventually become fragmented along cultural or ideological borders, and Europe would have to build a European firewall similar to the Great Firewall in China (which uses technology from European IT and telecom suppliers). The audience strongly objected to the notion of a digital Schengen border, which goes against the liberal tradition in many European countries and doesn’t recognize the range of beliefs and the diversity within Europe.
Benedek talked about Internet governance and the role of the Internet Governance Forum (IGF), a “forum for multi-stakeholder policy dialogue”. Concepts for dealing with illegal activities and what is considered acceptable and appropriate encroachment upon basic rights such as those guaranteed by the European Convention on Human Rights (ECHR) vary between countries. Even more, what is illegal in one country may be perfectly legal and even socially accepted behavior elsewhere.
Touching on net neutrality and the digital divide, he mentioned that there is a push to make Internet access a human right and some countries have indeed added rights to participate in the information society to their constitutions. At the same time the copyright industry focuses on the three strikes model in the Anti-Counterfeiting Trade Agreement (ACTA) model as punishment for intellectual property violations.
ACTA is not the only threat to access for all though: Much content today is only available to people who understand English, and not all content is suitable for children or accessible to elderly people. How we can make the net accessible to people of all ages and qualifications, and in their native languages, remains a challenge.
Basic human rights, including the rights to education, freedom of speech and freedom of press, increasingly have a material dependency on the right to Internet access. As an audience member pointed out, “offline” studying at university is virtually impossible; long gone are the days of paper handouts and blackboard announcements.
Both speakers agreed that the right to privacy requires “educated decisions” by the people, and consequently educating people. The lectures and the following lively discussion last night served that purpose well.
Related links:
- Announcement “Menschenrechte 2.0 – Menschenrechte in unserer Informationsgesellschaft“
- Fonds zur Förderung der wissenschaftlichen Forschung (FWF)
- PR&D Kommunikationsdienstleistungen GmbH
Labels: education, events, privacy, society, technology
Monday, May 31, 2010
Blogger on your site
If you are one of the 0.5% of bloggers who for whatever reason published via FTP or the more secure SFTP, you were left with a choice of moving your blog to blogspot.com or a custom domain name, or moving to another blogging platform. Importing your blog into WordPress is easy, WordPress has some nifty features that Blogger lacks, and you will easily find professionally designed WordPress themes, too, but switching to WordPress means going with the hosted solution on wordpress.com or installing and maintaining WordPress code on your server.
For those who want to stay with Blogger and have Blogger integrated into the Website there are two options, both requiring some hacking and configuration:
- Use the Blogger Data API to retrieve the blog in XML format and perform the rendering locally, most likely by processing the XML with XSLT stylesheets. While very flexible, this means losing the Blogger template capabilities.
- Build a reverse proxy that translates requests for blog resources to the corresponding URL on Google's servers. The proxy solution gives flexibility with URL formats and also allows for tweaking the generated HTML code before sending it to the browser.
The Blogger proxy solution
Here is how it works:
- Create backup copies of your blog in Blogger and on your server. The migration tool will update all previously published pages with a notice that your blog has moved, so you want to save the state of your blog first.
- Create a secret hostname for your blog in a domain you control, say secretname.example.com, and CNAME this to ghs.google.com. Don't limit your creativity, although the name really doesn't matter much. The migration tool checks that secretname.example.com is CNAMEd to ghs.google.com during the migration.
- Use the Blogger migration tool to move your blog to the new domain. At this point the blog will be up and running at secretname.example.com.
- Install a proxy script on your site which intercepts requests, rewrites the request as needed and sets a Host: secretname.example.com header, sends the modified request to ghs.google.com and rewrites the response to correct absolute links, and optionally tweaks the generated HTML code before sending the response to the browser.
- Configure the Webserver to invoke the script when no local content is available, for example in Apache:

  RewriteEngine On
  RewriteRule ^$ index.html
  RewriteCond %{REQUEST_FILENAME} !-f
  RewriteCond %{REQUEST_FILENAME} !-d
  RewriteRule . /bloggerproxy.php [L]

- Google will eventually attempt to index your blog under secretname.example.com. To ensure a consistent appearance of the blog on your site, as the last step point secretname.example.com back to your Webserver and forward requests with that server name to your proxied blog using a 301 redirect.
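The proxy script in step 4 is only a few lines of logic. The post implies a PHP script (bloggerproxy.php); here is an equivalent sketch in Python showing the same mechanics, with hypothetical hostnames standing in for the real ones. It is not the actual script, just the shape of one:

```python
from urllib.request import Request, urlopen

# Hypothetical hostnames: the secret Blogger host and the public site.
SECRET_HOST = "secretname.example.com"
PUBLIC_HOST = "www.example.com"

def rewrite_response(html):
    """Rewrite absolute links in the proxied HTML so the secret
    hostname never leaks to the visitor's browser."""
    return html.replace("http://" + SECRET_HOST, "http://" + PUBLIC_HOST)

def proxy_blog_request(path):
    """Fetch a blog resource from Google's servers with the secret
    Host header set, then clean up the returned HTML."""
    request = Request("http://ghs.google.com" + path,
                      headers={"Host": SECRET_HOST})
    with urlopen(request) as response:  # the actual network round trip
        return rewrite_response(response.read().decode("utf-8", "replace"))
```

Any further HTML tweaking (injecting analytics, fixing image URLs) would also go into rewrite_response before the page is sent to the browser.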
Disclaimer: This solution is not for the faint of heart. It requires more changes and configuration than simply switching to a custom domain name, and isn't blessed or supported by Google. Use at your own risk.
Labels: google, web2.0, webdevelopment
Thursday, March 25, 2010
Perl foreach loop and dynamic scoping
use strict;
use warnings;

my $PROTOCOL = 'http';

for $PROTOCOL (qw(http https)) {  # note: no "my" on the loop variable
    dosomething();
}

sub dosomething {
    print "$PROTOCOL\n";
}
Even with use strict and warnings turned on, the script runs without warnings, but rather than printing http and https in sequence, it prints http twice!
As it turns out, after much debugging (the sample code is stripped down from a larger script which actually does something useful) and collectively scratching heads, this is indeed the documented behavior:
The foreach loop defaults to scoping its index variable dynamically in the manner of local. However, if the index variable is prefixed with the keyword my, or if there is already a lexical by that name in scope, then a new lexical is created instead.
Lessons learned:
- After a decade of hacking Perl code, there's always something new to learn (and use strict doesn't stop the programmer from getting the scoping wrong).
- Reading the documentation (sometimes) helps.
Labels: perl, technology
Friday, January 1, 2010
Happy New Year 2010
We wish everyone a Happy New Year 2010!
As we started the new year, we watched not only the fireworks but also the “blue moon”, as the rare occurrence of two full moons in one month is called, albeit incorrectly as Mayo noted. The next time we will see this astronomical phenomenon on a New Year's Eve will be in 2028. Unfortunately the weather conditions weren't too good: lots of clouds throughout the day and now dense fog.
2009 in retrospect
In January, organizational changes and painful staff reductions at work marked an unfortunate start of the year 2009. On a more positive note, the painter was almost done with the house, and we had a surprise party to celebrate Andrea’s birthday–the first in a series of round birthdays amongst friends and family–with rented tables and chairs in an otherwise empty house. With the beginning of the spring term, I also resumed teaching at Webster University in Vienna with courses in Web animation (yes, that’s mostly Flash development) and Web design principles.
February and March were mostly filled with going to furniture stores, chasing Kika to deliver the missing kitchen cupboard (and annoying everyone with frequent updates on the incorrect delivery, partial delivery, non-delivery) and running errands around the house.
On April 2 and 4, we moved to the new house with great help from Christian, Michael and Rainer. We were exhausted after hauling boxes and furniture. There was plenty of stuff left in the old apartments waiting to be moved in the months to come, and we were missing some furniture (did I mention a kitchen cupboard?) but immediately enjoyed living in our new home.
In May, Elias and Daniel welcomed their new cousin and gently put their arms around the newborn baby, very cute.
Our first real holiday trip with the kids in July started with a damaged engine and the desperate search for a replacement vehicle. Other than the rough start, we had a good time, enjoyed Italian cuisine, the sea and the nice sand beaches in Lido di Jesolo.
In August, Andrea and the kids spent some time with the grandparents in Salzburg, so I had the house to myself, which meant more visits to the hardware store and eventually getting all shelves in the basement properly secured.
With a week-long block course at the picturesque Burg Schlaining I started my legal studies at the Johannes Kepler Universität Linz in September. The degree program is suitable for distance learning and will take at least four years.
After a busy fourth quarter we were looking forward to the holiday season. This year for the first time we celebrated Christmas at our place instead of visiting family. Our plan was that Elias and Daniel would attend church in the afternoon with the grandparents while we decorated the tree on Christmas Eve; they were both so exhausted from the excitement about the holidays that they fell asleep instead. Judging from the expressions on their faces when they saw the tree, we did a good job :-)
Labels: personal
Tuesday, September 29, 2009
Internet Summit Austria 2009
ISPA chairman Andreas Koman opened the session with statistics about Internet use in Austria and an overview of current developments and challenges.
Claudia Bandion-Ortner, minister of justice, admitted her preference for paper files and reminded the audience that the Internet is not an area unregulated by law. There are legal issues specific to information technology, such as data theft and violation of data privacy rights. While fraudsters and other criminals use the Internet, most crimes are media neutral. One area that is closely linked to the Internet, though, is child pornography. Bandion-Ortner referred to the controversial German pilot for blocking access to illegal sites. Needless to say, the same filter technology could be used for censoring access to legitimate information or enforcing intellectual property rights.
Volker Grassmuck delivered a keynote about the reform of intellectual property law in the digital age. Established “common sense” can block creativity and innovation. Some ideas worked well although most people would have assumed they wouldn’t:
- Shared space pioneered by Hans Monderman–“If you treat people like idiots, they will behave like idiots.”
- Shared code with the Free Software Foundation (FSF)
- Shared profits with the microcredits of the Grameen Bank–“People behave in a trustworthy way when they are trusted.”
On net neutrality, Grassmuck mentioned a speech by FCC chairman Julius Genachowski that refined the issue: net neutrality, but with network management to handle congestion and spam, with provisions for law enforcement, and with transparency, so that certain types of traffic may be blocked or throttled as long as customers are made aware.
There is no one solution that satisfies the needs of content producers, consumers and intermediaries. Working models will require a combination of an agreement between creative professionals and society, markets, free licenses, public subsidies and a “cultural flat rate”.
One of the conference gifts was, ironically, a USB stick with a locked down installation of Firefox using the Tor network to ensure privacy.
The keynote was followed by a lively discussion about intellectual property rights, including but not limited to compensation for the creator of content. The composer Johanna Doderer and the author Gerhard Ruiss pointed out that they want to maintain control over what happens with their works and reminded the audience that creative professionals are typically paid by how often their works sell. Georg Hitzenberger of Play.fm and Bettina Kann of the Austrian National Library outlined some of the challenges with obtaining rights for use in digital media and making content available. For example, the digital Web archive maintained by the Austrian National Library comes with unreasonably strict access requirements: it can be consulted in selected locations only, one person at a time. Franz Schmidbauer touched on legal aspects and the adequacy of intellectual property rights enforcement.
MEP Eva Lichtenberger made an interesting comment about giving young people the ability to purchase digital media without requiring a credit card, citing the large amounts spent on ringtones, for which telecom providers offer suitable payment solutions.
After the lunch break, Peter A. Gloor gave an entertaining presentation about “Coolhunting by Swarm Creativity” (that’s a lot of buzzwords for a title), explaining how his system combines different inputs–the wisdom of the crowd in the form of the Web, the wisdom of the swarms in the dynamics of fora and blogs, the knowledge in news and Wikipedia–to understand networks, trends and content. “Experts are right – in 50% of the cases. You never know which 50% you have.” The sites swarmcreativity.net and ickn.org have good information about the concepts and the Condor software, which is available for non-commercial use.
Two panel discussions about social networks and business on the Internet concluded the agenda.
Labels: austria, events, technology, web2.0
Friday, September 25, 2009
Brain food
The program started in early September with one week of lectures at the Burg Schlaining conference center. Besides getting an introduction into the concepts of law, registering and picking up the text books and DVDs, we had sufficient time for socializing and getting to know fellow students. On the first evening, the mayor invited us to a reception at the town hall. I appreciated the warm welcome and thoroughly enjoyed the week, learning something new in the relaxed atmosphere of scenic Stadtschlaining and meeting nice people.
Having spent a good portion of my own study period checking bulletin boards distributed throughout the campus for announcements and organizing my studies–the Web had not been invented yet–I was positively impressed by how well things are organized here. The lectures and practice sessions are available on DVDs or as video streams, and textbooks are available for all courses, eliminating the need to take illegible notes.
What's nice about the program is the flexibility of where and when you study. Of course that flexibility comes with the risk of procrastination, so feel free to ask me about my progress from time to time (read: no more than once a quarter!) as a gentle reminder.
Will I become a lawyer some day? Probably not. I haven't given much thought to how a law degree might change my career plans. Either way, it will be worth it to me.
Tuesday, August 11, 2009
Security, privacy, and an inconvenience
The Central Intelligence Agency (CIA) in 2006 began serving its Website encrypted in an effort to improve security and privacy of the communication.
This is a clear case for a 301 redirect from the unencrypted URL http://www.cia.gov/page to the equivalent encrypted URL https://www.cia.gov/page. Instead, except for the homepage and very few other pages, all requests get redirected to a splash page informing visitors about the site changes.
This is a bad idea for search, since all those links out there on various sites now transfer link weight to a splash page that is marked as non-indexable. It is also an inconvenience to users, who need to navigate to the specific content or go back to the previous page and try again with an edited link.
Even the old URL for the World Factbook, arguably one of the most popular resources on the site, no longer goes to the desired World Factbook homepage directly.
The CIA press release states: “We believe the inconveniences of implementing SSL for the entire website will be offset by increased visitor confidence that they are, in fact, connected to the CIA website and that their visits are secure and confidential.”
The effort to increase security and privacy is commendable, and encrypting all communication with the agency certainly isn't a bad idea. Doing so without the inconveniences would be even better though, and perfectly feasible, too.
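To sketch how little would be involved, here is what a site-wide permanent redirect looks like in nginx–chosen purely as an example, since the agency's actual server software is not known:

```nginx
server {
    listen 80;
    server_name www.cia.gov;

    # 301 = moved permanently; $request_uri preserves the path and query
    # string, so http://www.cia.gov/page lands on https://www.cia.gov/page
    return 301 https://www.cia.gov$request_uri;
}
```

Search engines treat the 301 as a signal to transfer link weight to the new URL, and visitors end up on exactly the page they asked for.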
Labels: networking, seo, technology