Much ado about scripting, Linux & Eclipse: card subject to change

2009-05-28

Eclipse Community Survey: 4 More Insights

Ian blogged 6 insights from this year's Eclipse Community Survey; here are a few more to get us to a full Top Ten list.

  1. What is your primary operating system?

    Linux is certainly a strong player in both development (26%) and deployment (40%), beating Mac (7% and 3%) but losing to Windows (64% and 38%). More interesting to me is the fragmentation within Linux: Ubuntu beats RHEL/Fedora by 10% in the desktop space (development) but loses in the server space (deployment).

  2. Where do you typically go to find Eclipse-related information?

    About two-thirds said Google and/or the Eclipse home page, which suggests the home page has improved - but also that a lot of people would rather just search. However, the survey didn't mention our finely crafted wiki.eclipse.org, or help.eclipse.org. Survey #FAIL.

  3. Are you or the organization you work for a member of the Eclipse Foundation?

    Five out of six respondents (83%) said No. So either we've done a terrible job of converting users into members, or people would rather give back in the form of testing, documentation, filing bugs, and writing articles. I suspect it's a little of both, but mostly the former.

    Kudos to the contributors, and shame on the corporate drones for not convincing their queen to send a little honey back to Eclipse.

  4. In the last year, how have you participated in the Eclipse community?

    While nearly a quarter of respondents (24%) said "I entered at least one bug into Bugzilla", more than two-thirds said they "used Eclipse but didn't actively participate in the community." To me that's a clear sign we have more users than contributors. Is that because most Eclipse users are Windows folks who don't grok that Open Source works best when everyone sees themselves as part of the process, rather than just as consumers?

I've been reading More Joel On Software recently, thanks to winning a prize for bringing a bag purchased in Alaska to EclipseCon this past March. One article from it stands out in this context: Building Communities with Software, from March 2003. Here's an excerpt:

The social scientist Ray Oldenburg talks about how humans need a third place, besides work and home, to meet with friends, have a beer, discuss the events of the day, and enjoy some human interaction. Coffee shops, bars, hair salons, beer gardens, pool halls, clubs, and other hangouts are as vital as factories, schools and apartments ["The Great Good Place", 1989]. But capitalist society has been eroding those third places, and society is left impoverished.

...

So it's no surprise that so many programmers, desperate for a little human contact, flock to online communities - chat rooms, discussion forums, open source projects, and Ultima Online. In creating community software, we are, to some extent, trying to create a third place.

If you feel your third place is lacking, please consider contributing more to Eclipse, to Fedora or CentOS, to JBoss Tools, or whatever tickles your fancy. Just give something back. Your community will thank you, since, after all, "A rising tide lifts all boats."

UPDATE, 2009/05/30: Mike's right, calling our users "freeloaders" isn't fair. I just wish there were a more obvious way to convert users into contributors.

2009-05-26

Dash Athena: Eclipse Common Build System / Running Tests On Your System

Bjorn recently kvetched that Eclipse projects met two or three of those goals, but fell down on the "common build system" and "tests run on your system" [1].

While it's true I've seen a number of projects that don't have, don't run, or don't publish their tests, I'm a little disappointed to see Bjorn's no longer committed to the common build solution we've been working on since September 2006 (in earnest since June 2008). We do have a project to solve both those concerns, but like all things at Eclipse, it's powered by YOU. If you want it to happen, you have to help. I'm looking for a few good contributors and committers for the Dash Athena project to supplement the great people we already have. Or, if you don't have time to contribute code, you can help by using the system, testing it, opening bugs, enhancing documentation, and blogging about it.

So, what is Dash Athena?

Well, it's a common build system based on Hudson and PDE which can also be run from the commandline on Linux, Windows, or Mac, or inside Eclipse. It can produce zips of plugins, features, examples, and tests, and then run those tests. It can also produce update sites with p2 metadata, which can then be published to eclipse.org (or sourceforge.net, for that matter) so everyone can get your bits via Update.

Tests will currently only run on Linux - if you'd like to help us port to Mac OS X and Windows, please step up. The system works with CVS, SVN, and probably Git/Bzr/Hg too, since it supports building from locally checked-out sources and will copy your features/plugins so they're in the format that PDE requires. It supports source input via map files (soon Project Set Files (*.psf), too!) and binary inputs via zips and p2 repos / update sites.
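
To give a rough idea of what a headless run looks like, here's a minimal sketch. It assumes you've checked out org.eclipse.releng.basebuilder, org.eclipse.dash.common.releng, and your project's *.releng project side by side; the build.xml entry point and the -D property names below are placeholders rather than the definitive Athena ones, so check the wiki docs for the real invocation:

# sketch only: paths and -D property names are illustrative
cd /path/to/build/workspace
ant -f org.eclipse.dash.common.releng/build.xml \
  -DprojectPath=/path/to/your.project.releng \
  -DwritableBuildRoot=/tmp/athena-build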

If you aren't sure how to get started w/ an Athena build, please don't hesitate to ask. If you feel the docs are insufficient, incomplete, or inaccurate, let me know - or, better, fix them! Want your own Hudson job to run your build? Just open a bug and we'll set you up.

Oh, and incidentally, the irony is not lost on me that I'm using American iconography above even though 5 of the 6 committers on the project are Canucks. :)

2009-05-25

They're Coming To Make Me Write ASP!



Remember when you ran away
Big Blue got on their knees
And begged you not to leave
PDE'd go berserk

You left 'em anyhow
And then the days got worse and worse
And now I see you've gone
(Completely out of your mind)

And they're coming to take you away ha-haaa
They're coming to take you away ho ho hee hee ha haaa
To the Redmond farm
Where life is beautiful all the time
And you'll be happy to see those nice young men
In their - see? Sharp coats
And they're coming to take you away ha haaa

We thought it was a joke
And so we laughed
We laughed when you had said
That you could leave the FLOSS and work for Bill

Right? You know we laughed
You heard us laugh. We laughed
We laughed and laughed but still you left
But now we know you're utterly mad

And they're coming to take you away ha haaa
They're coming to take you away ho ho hee hee ha haaa
To the happy home with bugs and Vista and viruses
Security "fixes" which patch and patch and open new hacks and holes
And they're coming to take you away ha haaa

We've read your blogs
And used your code
And this is how you pay us back
For all our kind unselfish, loving deeds?
Ha! Well you just wait
They'll find you yet and when they do
They'll make you write with ASP.net
You well-dressed geek

And they're coming to take you away ha haaa
They're coming to take you away ha haaa ho ho hee hee
To Camp Microserf where life is beautiful all the time
And you'll be happy to drink that nice Kool-Aid
In their clean white cups
And they're coming to take you away

Neuroticfish - They're Coming To Take Me Away

2009-05-22

Use Your Metadata, Vol. 2 [Update]

Wednesday I went off on a bit of a G'n'R-fueled rant about metadata, documentation, and the shotgun blues. Today, I'd like to focus on something more positive.

As Pascal blogged the other day, the new p2 is almost done and is ready for tire-kicking. Some new features I personally like include:

  1. a new p2.director app / task, which includes support for installing multiple IUs (feature.groups) in the same step and finally has commandline help (see the sketch after this list)
  2. a new p2.repo2runnable ant task, used to convert an update site zip to the old-school unpacked "runnable" features/plugins format so that one day we will be able to throw away all those extra zips.

    UPDATE, 2009-06-02: repo2runnable now works as a commandline application too, thanks to Andrew's fix. Wiki updated.
  3. Composite Repo, Mirroring and Slicing Tasks - haven't tried these yet, but they look like they'll be very handy for one day replacing the hack that is buildUpdateSite.sh for our Modeling Project composite repos with something more robust and easily maintainable.
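
To illustrate the first item above, here's a minimal sketch of driving the director app headlessly from a shell; the repository URL, IU names, and destination are just examples, so check the app's commandline help for the full set of options:

eclipse -nosplash -consoleLog \
  -application org.eclipse.equinox.p2.director \
  -metadataRepository http://download.eclipse.org/releases/galileo \
  -artifactRepository http://download.eclipse.org/releases/galileo \
  -installIU org.eclipse.emf.sdk.feature.group,org.eclipse.gef.feature.group \
  -destination /path/to/target/eclipse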

I'm also impressed that there is new, current documentation regarding the above tasks, as well as on the new Publisher which replaces the Metadata Generator.

Will this release be p2's salvation?



Well, I'm split on the new default behaviour in the update UI: when you add a new update site, p2 no longer searches ALL your other listed sites by default. This is a great performance gain if you're installing a new self-contained feature, but a pain if you're installing something like VE, which depends on EMF and GEF, and you don't already have those deps installed. The simple workaround is to pick the "all sites" entry in the dropdown.

I'm also waiting to see if there will be something better done about recovery from slow/incomplete mirrors.

But other than these minor concerns, I'd say YES. With lots more commandline and ant toys available, p2 is certainly maturing. And with more people adopting its use and spinning p2 repo zips, more testing is being done, and more use cases are being covered.

So... get in the ring, and go a few rounds with p2. It's worth the battle. :)

2009-05-20

Use Your Metadata, Vol. 1

It's been a bad week for update sites and the Galileo contribution from Modeling... and I confess I'm partly to blame. That, and the fact that despite all the documented processes, workarounds, tips, tricks, and advice... no one Reads The Fine Mediawiki (Category:Releng or Modeling Project Releng).

Highlights:

  1. The mysterious appearance of a new version of org.eclipse.osgi_*.jar in releng.basebuilder's R35_M5 tag, which caused an ant <copy/> used to rename a file to fail, because copy can't merge two jars into one file. Still no idea why an old basebuilder tag would magically grow new jars, but I've worked around the now-faulty assumption w/ smarter Ant code.
  2. A change to the way our sites are created, in an attempt to work around what I believe (but can't yet prove) is a flaw in the way content.xml is produced - namely, if the xml file is > 21M, it gets truncated or corrupted. We used to cache 2 or 3 releases of a given project (eg., M5 and M6) on the same site, in order to give people a way to "back up" to the previous release; now, you only get the latest (bug 271486). I confess I screwed up here: instead of replacing a folder with new contents, I was copying INTO it - `cp -r one two` against an existing two/ nests the source as two/one/ instead of replacing two/ (see the sketch after this list). I fixed that by switching to a move instead of a copy, but a downstream process failed because it assumed both one/ and two/ would exist. The lessons here are: a) shotgun debugging sucks, and b) don't change the way stuff is created after M7.
  3. People publishing two updates to a site at the same time, resulting in the appearance of two </site> tags in a site.xml file, causing p2 metadata generation to be incomplete or fail entirely; unfortunately, no error is logged when this happens so it's rather difficult to decipher the tea leaves. This may be the real source of the metadata corruption, if not the "file is too big" issue above.
  4. Observations about obsolete jars corrupting metadata, but no one taking it upon themselves to clean up the site or do some troubleshooting.
  5. People inconsistently naming their milestones (it's 2.0.0M7, not just M7!) and corrupting our Release Notes database. This one amazes me the most since it takes seconds to see what was done last time (check any of the following: RSS feeds, release notes, downloads pages, update sites) and follow suit. And, of course, the conventions are documented, along with the rationale (consistent patterns == simpler code).
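
To illustrate the copy-vs-move gotcha from item 2, here's a minimal sketch (one/ and two/ are placeholder names, not the real site folders):

mkdir -p one two        # both directories already exist
cp -r one two           # nests the source: you end up with two/one/
rm -rf two; mv one two  # replaces two/ with the new contents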

Or, to put it another way...

Sick of this life
Not that you'd care
I'm not the only one with
whom these feelings I share

Nobody understands
Quite why we're here
We're searchin' for answers
That never appear

But maybe if I looked real hard I'd
I'd see your tryin' too
To understand this life,
That we're all goin' through

Sometimes I feel like I'm beatin' a dead horse
And I don't know why you'd be bringin' me down
I'd like to think that your love's
Worth a tad more
It may sound funny but you'd think by now
I'd be smilin'
I guess some things never change
Never change

So, please, can we stop opening bugs (277172, 277105, 277034, 276928, 276641) and just use the tools and docs already available?

2009-05-16

HOWTO: AVI to DVD Conversion

Yesterday I set about learning how to convert an Xvid-encoded .avi to DVD so that my more technically challenged relatives can watch an HBO movie that's not available in video stores.

Here's the process, with approximate elapsed times. Thanks go entirely to linuxquestions.org for the solution.

Hardware used: Thinkpad R51 (Pentium M 1.6GHz, 1.2G RAM) + Samsung USB DVD-DL burner (16.4x1352KBps).

  • Acquire your .avi file source. For a 1.1G .torrent, this took 2 hours.
  • Copy the source onto a drive with sufficient space to re-encode it. 20 - 25 minutes for USB-to-USB copy between drives.
  • Verify installed software requirements. I still have an old copy of xubuntu on the R51, so I needed to install these tools. Under 2 minutes.
    apt-get install dvdauthor dvd+rw-tools \
      transcode mplayer ffmpeg mjpegtools xine
    
  • Create a dvdauthor.xml file (a minimal example appears after this list). Under 2 minutes.
  • Check for 5.1 (AC3) audio using the mplayer command below; if it reports an AC3 track, run the tcextract pipeline that follows, otherwise skip this step. For my video, skipped. Seconds.
    mplayer -vo dummy -identify movie.avi
    tcextract -d2 -i movie.avi -a0 -x ac3 | tcextract -d2 \
      -x ac3 -t raw > movie.ac3
  • Split the 16:9 NTSC widescreen .avi into movie.m2v (video) and movie.ac3 (audio). Create an MPEG from the audio and video pieces. Generate DVD/AUDIO_TS/ and DVD/VIDEO_TS/ dirs in the current dir using the dvdauthor.xml to add chapters every 15 minutes. All three steps back-to-back, 6 hours.
    transcode -i movie.avi -y ffmpeg --export_prof dvd-ntsc \
      --export_asr 3 -o movie -D0 -s2 -m movie.ac3 -J \
      modfps=clonetype=3 --export_fps 29.97; \
      mplex -f 8 -o movie.mpg movie.m2v movie.ac3; \
      dvdauthor -x dvdauthor.xml
  • Verify the video and audio will play. Seconds.
    xine dvd:/full/path/to/DVD/VIDEO_TS/
  • Burn the DVD. 90 minutes for a 2.8G DVD image USB-to-USB burn.
    growisofs -Z /dev/dvd1 -dvd-video DVD/
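
For reference, here's roughly what the dvdauthor.xml from the earlier step can look like, written as a shell heredoc; the dest directory and the chapter timestamps (one every 15 minutes) are illustrative, so adjust them for your movie:

cat > dvdauthor.xml <<'EOF'
<dvdauthor dest="DVD">
  <vmgm />
  <titleset>
    <titles>
      <pgc>
        <vob file="movie.mpg"
             chapters="0,15:00,30:00,45:00,1:00:00,1:15:00,1:30:00"/>
      </pgc>
    </titles>
  </titleset>
</dvdauthor>
EOF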

Total time to acquire a 1 hour 43 minute DVD: just under 10 hours. Your mileage may vary.

2009-05-15

The Year of the Linux Desktop... some day.

It's been said for many a year that *this* will be the year Linux breaks through into the desktop space. Clearly we're still a long way off, but it's nice to see that over the past two years, both Linux and BSD-based Mac OS X have taken share away from Redmond.

Of particular note, Linux has finally hit 1% of the desktop market. Just 99% more to go!

2009-05-12

Work it, Baby, Work it!

Next Thursday, May 21 is the Toronto Demo Camp from 6:30-8:30 with refreshments thereafter. We haven't nailed down all the details yet (like the order of presentations or what food will be served), but I can safely say that barring any Canada Post mixups, there will be Eclipse golf shirts available to be won by those in attendance.

How do you win? Well, by presenting, of course! Sign up now!

The presentation format is flexible, and this is an informal event... so if you have 5 mins, 15 mins, or 30 mins of material, great! We'll provide the conference room, projector, whiteboards, and even a laptop if you provide the demo materials. The rest is up to you.

When signing up, please include your name, topic & expected duration. Looking to strut your funky stuff and get some early adopters as you get close to the June release? Here's your chance.

2009-05-11

Automatic Eclipse mirror selection / Better download pages?

Have you ever wanted to fetch a whole stack of Eclipse project runtimes so you can build against them? For example, say you want all the Galileo M7 builds from Platform, TPTP, BIRT, DTP, WTP, and dependencies (EMF, GEF, XSD). You can find the URLs for each zip you want on the projects' pages, and download them one-by-one from the closest mirror, but that's time- and bandwidth-consuming, esp. if you want these on a remote server, not your local box.

Enter the "&r=1" option on http://www.eclipse.org/downloads/download.php, which will fetch from the closest mirror automatically.

So, now, you can script the M7 stack fetch like this:

#!/bin/bash
# fetch.sh: download each URL passed on the commandline from the closest mirror
for u in "$@"; do
  if [[ ! ${u##*file=*} ]]; then # URL contains "file=", so add the r=1 suffix
    u=${u}"&r=1"
  fi
  echo "wget $u ..."
  wget --no-clobber "$u"
done

Then run it like this:

./fetch.sh \
"http://www.eclipse.org/downloads/download.php?file=/tools/gef/downloads/drops/3.5.0/S200905011522/GEF-runtime-3.5.0M7.zip" \
"http://www.eclipse.org/downloads/download.php?file=/birt/downloads/drops/M-R1-2.5M7-200905061338/birt-report-framework-2.5M7.zip" \
"http://www.eclipse.org/downloads/download.php?file=/birt/downloads/drops/M-R1-2.5M7-200905061338/birt-wtp-integration-sdk-2.5M7.zip" \
"http://www.eclipse.org/downloads/download.php?file=/datatools/downloads/drops/N_DTP_1.7/dtp-1.7.0M7-200905052200.zip" \
"http://www.eclipse.org/downloads/download.php?file=/eclipse/downloads/drops/S-3.5M7-200904302300/eclipse-SDK-3.5M7-win32.zip" \
"http://www.eclipse.org/downloads/download.php?file=/eclipse/downloads/drops/S-3.5M7-200904302300/eclipse-SDK-3.5M7-linux-gtk.tar.gz" \
"http://www.eclipse.org/downloads/download.php?file=/eclipse/downloads/drops/S-3.5M7-200904302300/eclipse-SDK-3.5M7-linux-gtk-x86_64.tar.gz" \
"http://www.eclipse.org/downloads/download.php?file=/eclipse/downloads/drops/S-3.5M7-200904302300/eclipse-SDK-3.5M7-macosx-carbon.tar.gz" \
"http://www.eclipse.org/downloads/download.php?file=/modeling/emf/emf/downloads/drops/2.5.0/S200905041408/emf-runtime-2.5.0M7.zip" \
"http://www.eclipse.org/downloads/download.php?file=/tptp/4.6.0/TPTP-4.6.0M7-200904260100/tptp.runtime-TPTP-4.6.0M7.zip" \
"http://www.eclipse.org/downloads/download.php?file=/webtools/downloads/drops/R3.1/S-3.1M7-20090505073946/wtp-S-3.1M7-20090505073946.zip" \
"http://www.eclipse.org/downloads/download.php?file=/webtools/downloads/drops/R3.1/S-3.1M7-20090505073946/wtp-jpt-S-3.1M7-20090505073946.zip" \
"http://www.eclipse.org/downloads/download.php?file=/modeling/emf/emf/downloads/drops/2.5.0/S200905041408/xsd-runtime-2.5.0M7.zip" 

So, now the only problem is that every project structures & styles their downloads pages differently...

  1. DTP -> choose file(s)
  2. GEF -> choose file(s)
  3. EMF/XSD -> choose file(s)
  4. WTP -> choose build -> choose file(s)
  5. Platform -> choose build -> choose file(s) & platform(s) -> click through warnings
  6. TPTP -> choose branch tab -> choose build -> choose file(s)
  7. BIRT -> More Downloads -> Recent Builds -> choose build -> choose file(s)

Am I the only person that finds this inconsistency annoying? Is it time for a more consistent UI? I'm exploring what to do for Athena-based builds, and welcome suggestions in bug 275682. What pages do you like best? Worst? Which are easiest to use? Hardest? Do you prefer the old blue-and-white pages? The purple Phoenix pages? The grey Nova pages? Any UI designers want to contribute?

Or, really, are downloads obsolete? If we could collect statistics on p2 jar downloads, I'm sure we'd see that most people prefer that approach, and I for one would certainly prefer to build against a p2 repo (or 7) rather than a pile of zips. I suppose the hybrid solution for now is to provide zipped p2 repos for download, as many projects do today (Modeling, GEF, PDT, VE ...).

2009-05-08

No More Blue Balls

Just a quick note to let people know that I've updated the Eclipse Hudson instance (and you can too, if you're in the Hudson admin group!)

As part of this update, I've installed a plugin to make Hudson look more consistent with other CI tools.

I know it's generally not advisable to do potentially breaking changes like this on a Friday evening, but after a number of ups and downs this week, the Galileo BuckyBuilder is actually green (blue) ...

... and it's my last weekend as a 25 year old, so out with a bang we go.

If I broke anyone's Hudson job w/ this update, let me know - I'll be checking mail over the weekend, just in case.

2009-05-07

A Week Without Firefox

Last week I fired the Fox and switched to using Opera 9.6. Today, I'm back to Firefox 3.0 because while Opera has a few nice features, it ultimately lags behind FF (for me, anyway) in usability and functionality.

Here's how they stack up:

Opera 9.6's Pros

  1. Sidebar notepad feature
  2. Speed dial homepage
  3. Minimalist UI with sidebar (incl. a handy notepad app and the usual suspects: transfers/downloads, history, bookmarks). For web dev, there are some handy extras like Links (a list of all the links in a page) & Info (page metadata)... but then FF also provides these via a different UI
  4. Ability to do "g keyword keyword2" to search Google for those keyword(s) (Firefox just does this without the "g")
  5. Mouse gestures
  6. Single "Wand" password manager login for entire session (rather than per-window - see Firefox Cons below)

Firefox Pros

  1. Awesomebar searches within history allow minimal typing like "hu ec ve l ar" to pull up a long URL like https://build.eclipse.org/hudson/view/Athena%20CBI/job/cbi-ve-1.4.x-Ganymede/lastSuccessfulBuild/artifact/
  2. Ability to create keyword associations for bookmarks, so that "b 272403" will load https://bugs.eclipse.org/bugs/show_bug.cgi?id=272403
  3. Tons of plugins/extensions, including: mouse gestures, Twitter, Delicious, Tab colouring & detach/merge, ...

Opera Cons

  1. Location bar only works with URLs and sometimes page names. Way more typing needed than in FF
  2. No ability to undo the closing of a tab
  3. Cannot reproduce FF extensions in Opera; Delicious and Twitter integration are not nearly as good; no tab colouration, single view of downloads, no Tasktop support.
  4. Cannot store username/password pairs for in-page login forms (only browser-level ones). Repeatedly having to log in to JBoss Hudson every few hours is a royal pain.
  5. Crashes unexpectedly but previous session can be recovered.
  6. Lame icon with a dropshadow. Retro, sure, but c'mon, they've had that for AGES, and it's just lame.

Firefox Cons

  1. Memory bloat
  2. Crashes unexpectedly but previous session can be recovered.
  3. When reloading a saved/crashed session, every single page requiring access to the password manager pops a login dialog; sometimes I get to enter my password 7 or 8 times, or hit ESC repeatedly to lose those tabs.

I also briefly tried Firefox 3.1beta4, but as none of my extensions work there yet, it's not much better than Opera at this point. It's supposed to be better on memory, and has new bells and whistles being added to the Awesomebar. It's also supposed to be implementing a lot of functionality I get now from the above extensions, such as better tab management.

So, ultimately, I'm back to Firefox 3.0.10.

2009-05-04

Git 'Er Done

The discussion in bug 257706 (Host a git repository on Eclipse Foundation servers, support git as the repository of Eclipse projects) rages on, with the wind blowing in both directions.

Let's look at the objections to Git @ Eclipse:

Implementing a common build infrastructure would also be complicated by additional code repositories as well.

Not true; the Athena system already supports CVS and SVN, plus a "build from local sources" mode which works w/ a cvs/svn tree dump, a workspace w/ checked out projects, or (TBD, we haven't tested this yet) with a git repo. And we have an open bug to make repo tree structure irrelevant to the local checkout mode. Party on. There is even a Git plugin for Hudson so you can use Hudson to watch your repo for changes, like it does with CVS and SVN.

Can't use unapproved or non-EPL code at Eclipse.org

Not true; from discussions w/ legal@eclipse.org, I've been told at least twice that as long as you're not SHIPPING code that is non-EPL or lacks an approved CQ, you're entirely fine to USE that code as server-based infrastructure. Rock on.

Cannot include tooling in a release train or host its project at Eclipse.org

Not true; since eGit is EPL and jGit is BSD, I don't see a problem with distributing the tooling that would connect to a Git repo hosted at Eclipse.org. We worked around the license woes for SVN tooling support. We can do it again. (Of course IANAL, TINLA.)

Conclusion:

No legal concerns with use of Git as hosted server infrastructure. Dash Athena (Common Builder) will support Git. Tooling is safe for inclusion in release trains (either fully like CVS is or partially like SVN is).

The only remaining issue, therefore, is to get it installed and to allocate resources to support and manage it. With all the erosion going on lately, this need should not be trivialized.


Before the thaw this spring, this tree was on top of the bluff. With nothing to support it, it was dropped like an unchampioned feature request.

However, in the spirit of open source, several people on the above bug have offered to help w/ setup, testing, support, etc. So the burden here will be shared, like many things at Eclipse (eg., Babel, Hudson). Erosion continues, but we can all help to shore up the loss.

As most people prolly already know, Sourceforge supports the whole spectrum of VCS and DVCS options. If we don't want people to host projects there, Eclipse has to at least offer something from the DVCS world to encourage participation here. Keep the barrier to entry high, and people will go elsewhere. Lower the barrier, and people will come here to party instead.

With everyone feeling the economic- and time-pinch these days, can we really afford to discourage contributions at Eclipse simply because, as the silverbacks say, "why, back in my day, we only had CVS, vi, and notepad, and dangit, that was good enough!"?

After all, the new world is inevitable.

2009-05-01

Just enough process

This entrance to a forested area in my neighbourhood used to be blocked by a fence with a chained gap barely wide enough to permit clearance for a bike, requiring me to duck to get through.

Recently, it was replaced with just enough of a barrier to prevent cars from getting in, without disrupting the flow of pedestrians, dogs, and cyclists.

Sometimes a whole new approach can vastly improve how a community can gain access to resources.