Archive for January, 2009

Published: 28 Jan 21:51 EST (02:51 GMT)

WASHINGTON – The United States and its allies might have to deploy up to 460,000 soldiers to North Korea to stabilize the country if it collapses and an insurgency erupts, a private U.S. study said Jan. 28.

The Council on Foreign Relations (CFR) think tank outlined what amounted to a worst-case scenario in the event the country descends into total chaos and foreign troops intervene after a failed succession upon Kim Jong-Il’s death.

The South Korean agency Yonhap, quoting “well-informed intelligence sources,” reported this month that Kim, 66, had named his third son, Kim Jong-Un, 24, as successor. Kim is reported to have suffered a stroke in August.

In its 37-page report entitled “Preparing for Sudden Change in North Korea,” the influential New York think tank outlined scenarios based on whether the succession is managed, contested or has failed.

“North Korea abuts two great powers – China and Russia – that have important interests at stake in the future of the peninsula. That they would become actively engaged in any future crisis involving North Korea is virtually guaranteed,” the CFR said.

The report did not rule out military intervention by foreign powers.

“The prospect of North Korea being absorbed by South Korea and U.S. forces potentially being deployed near China’s northeastern border are matters of acute concern,” the report said.

“The same fears helped trigger China’s entry into the Korean War. Moscow undoubtedly shares many of Beijing’s concerns, though Russia appears less poised to intervene should the situation deteriorate,” it added.

Foreign military intervention could create another dynamic.

“If former elements of the North Korean military, its security and intelligence forces, or its large special operations force were to resist the presence of foreign forces, the size of the needed stabilization force would escalate dramatically,” it said.

“In an insurgency, according to a Defense Science Board study, as many as twenty occupying troops are needed for every thousand persons, implying a force of 460,000 troops,” it said.

It pointed out such a force would be more than three times the number of U.S. troops in Iraq.

“Coping with such a contingency would likely be impossible for the South Korean and American forces to manage alone,” it added.

The report also raised concerns about North Korea’s stockpiles of weapons of mass destruction, including nuclear, biological, and chemical programs.

North Korea tested a nuclear device in 2006, but it has since been pursuing difficult negotiations with the United States, China, South Korea, Japan and Russia to scrap its nuclear programs.

“A possible breakdown over North Korea’s stockpile of weapons of mass destruction (WMDs) would likely provide even stronger pressures to intervene,” it said.

“If the cohesion of the military were to begin to fray, preventing leaking of WMDs, materials and technologies beyond the North’s borders would become an urgent priority,” it said.

“Although neighboring states share a common interest in preventing such leakage, serious differences could still arise over the necessity and execution of any military operation designed to secure WMDs,” it said.

As succession talk increases, the report urged close cooperation between the United States, China and other players in the region to help avert the worst.


The February 2009 National Geographic has a 26-page article as well as a photo essay about North Koreans fleeing political tyranny in their country and making the long and dangerous trek to South Korea through China and SE Asia. This article may explain a little about why I’m so interested in the North Korean humanitarian crisis. If not then this blog post certainly does.


I posted recently to show that working multi-touch input is available on the T-Mobile G1 phone. Now the necessary changes to the Android software stack are finally in good shape, and the software is easily installable on your own phone. (“Easily” being a relative term of course — you have to re-flash your phone’s firmware, and that may void the warranty…)

Keep reading for links to instructions for installing multi-touch support on the G1, as well as source code and screenshots for several multi-touch demo applications.

Please donate to support continued development of awesome features for Android!

Obligatory video

This video shows the multi-touch demo apps that are described below.

EDIT: Yes, these demos are a little slow/clunky if you’re used to the iPhone — but that’s not the point. The demo apps (and the hacked system apps, Browser and MapView) can be made a lot more responsive through the addition of:

  • OpenGL acceleration
  • kinetic zooming / inertia (so that “fling” is supported) — should be trivial to add to the multitouch controller, I just didn’t do it
  • and, in the case of maps and the browser, the addition of an API that is designed for partial zoom factors: the current API for both just supports “zoom in”/”zoom out” not “zoom to 1.053x current zoom”, and starts the built-in jerky animation to get there, jumping in/out by a big step each time the animation has finished.
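For illustration, the kinetic zooming mentioned in the list above amounts to little more than an exponentially decaying zoom velocity applied each frame after the fingers lift. A minimal sketch (all names here are invented, not part of any Android API):

```java
// Minimal sketch of kinetic ("fling") zoom: after the fingers lift, keep
// applying the last zoom velocity, decaying it by a friction factor each
// frame until the animation comes to rest.
final class KineticZoom {
    static float[] run(float scale, float velocity, float friction, int steps) {
        float[] frames = new float[steps];
        for (int i = 0; i < steps; i++) {
            scale *= (1.0f + velocity);   // apply current zoom velocity
            velocity *= friction;         // exponential decay toward rest
            frames[i] = scale;
        }
        return frames;
    }
}
```

Each successive frame zooms by a smaller increment, which is exactly the “fling” feel the controller is missing.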

This is a proof of concept, people — of course it will be optimized over time — the fact it’s not done yet doesn’t mean it can’t be or won’t be. Apple’s engineers were paid mega moola to implement their bling, I was not. It’s actually not that bad if you try it yourself — and here’s the real point: you can get working MT on your G1 TODAY and it was never designed for that. Try it out and you’ll agree that MT zooming in the browser is a freakish lot better than the zoom button solution that Google implemented in the 1.0 release.


I moved my original multi-touch code back into the kernel, because it turns out that currently it’s a lot easier to patch the kernel and get a working Android system than it is to patch the Android Java stack and get a working system. (The Android Java stack that made it into the G1 was branched and polished long before the source code was released publicly, and the source code in git usually doesn’t run without problems due to being in a state of flux.) You can find the kernel patch to the synaptics touchpad driver here. Many thanks to zinx for helping to polish the kernel patch and figure out the best way to get multitouch info into userspace.

I also patched the Android browser to support multi-touch scaling, source/diffs are linked below. The patched version also includes support implemented by JesusFreke for autorotating web pages based on phone orientation (you turn the phone on its side without even sliding out the keyboard, and the web page you’re viewing rotates) — you have to manually enable this in the Preferences to get it working though. See a list of known issues with the browser below.

Note that this approach to multi-touch is completely backwards- and forwards-compatible: you can write an app that implements extra features if multi-touch is supported on the phone. (The technical details: currently there is no way for an app to detect whether or not the phone has this multi-touch patch until the user initiates a multi-touch gesture, but at that point the size field of a MotionEvent in the onTouchEvent() method will have a value greater than 1.0. You can read multi-touch information by parsing this value, as shown in MultiTouchController.java in the demo apps below. Unpatched G1s will never generate a size value greater than 1.0, so you know that a size value less than or equal to 1.0 is a single-touch event regardless of whether or not the phone has been patched.)
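Boiled down to plain Java, the compatibility rule above is a one-liner (a sketch; on the phone the value comes from MotionEvent.getSize() inside onTouchEvent(), and MultiTouchController.java in the demos does the full parsing):

```java
// Sketch of the backwards-compatibility rule described above.
// An unpatched G1 never reports size > 1.0, so any oversized value must
// come from the kernel patch during a multi-touch gesture.
final class MtCompat {
    static boolean isMultiTouchEvent(float size) {
        return size > 1.0f;
    }
}
```

A single-touch app that ignores the size field keeps working unchanged, which is the whole point of the encoding.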

EDIT: I received the following comment on this blog post, and wanted to highlight it, along with my response: “So let’s add this kernel patch to the mainline linux kernel, then google has no choice but to include it (I’m sure goog devs would love this). So google pushes this kernel patch with a OTA update, and everyone can use multitouch. I’m sure google would then try to delete all multitouch apps off the marketplace or they fight it out w/ apple over their bogus lawsuit.” My response: “Won’t happen, this is just a hack. The mainline kernel doesn’t need modifying, it already generates multitouch events, the event pipeline in the Android Java stack just drops the info for the second touchpoint. I hacked the kernel driver to hide info about the second touch point in the first touch point’s size field, so that the Java stack doesn’t need modifying. However ultimately the correct solution is to fix the Java stack to pick up the multitouch info and pass it on to userspace. This will require quite a lot of re-engineering though.”

Getting Multi-Touch on Your Phone

Unless you want to build and install your own patched Android kernel, the easiest way to get multi-touch on your phone is to get root access on your phone and then install JesusFreke’s replacement Android firmware image v1.41, available here:

–> Rooting your phone and installing JFv1.41 firmware images <–

The JFv1.41 images include the multi-touch patches to the kernel and browser to support multi-touch zooming, as well as numerous other useful bits and pieces (such as busybox, ext2 support, easy “su” root access, a data partition backup utility, and more).

If you want more detailed directions, you can find them here and here.

IMPORTANT NOTE: Flashing your phone may void your warranty (although it is not clear whether it actually does) and carries a non-zero risk of bricking your phone (rendering it unusable). If you re-flash your phone’s firmware, you take full responsibility for any damage that may result. Generally if you are careful, you won’t run into any major problems — but note especially that all user-data will be wiped from your phone when you first re-flash back to RC29 to get root access. Any data that is synchronized with the Google mothership *shouldn’t* be lost, because it’s primarily stored on Google’s servers, but non-synchronized data *will* be lost, e.g. your SMS message history, your Market-installed applications and their data, and probably other things. There are a few backup utilities on the Android Market, e.g. one to back up SMS messages. Once you have root and have patched your phone to include the engineering bootloader, you can use the nandroid backup utility to back up files so that this does not happen in future.

Note also that OTA (over-the-air) updates from T-Mobile are generally disabled when you install community-generated alternative flash images like the ones above. This is to prevent T-Mobile re-flashing your phone with an image that takes away root access from you again. This means that when T-Mobile releases an update, you will need to wait a few days for someone in the community to produce an updated system image that includes the update. Generally (and increasingly, as the public source code repository becomes more stable), community images will be far more up-to-date anyway than what T-Mobile pushes out to you over the air, but there is of course a risk that something might break. JesusFreke has images based off of both T-Mobile official firmware and development (ADP1.1) firmware.


I have written several multi-touch demo apps for the G1. In most cases these are not intended for widespread use but to demonstrate to developers how to make use of the new multi-touch information that is generated by the kernel patch. Each app contains a copy of a class MultiTouchController.java that I wrote to simplify writing MT apps. Note that kinetic panning or zooming has not yet been implemented in the controller, but it should be simple enough to add.

Looks identical to the regular Android Browser -- but much easier to zoom on...


Multi-Touch Browser

This is a patched version of the standard Android WebKit browser that supports multi-touch zooming. JesusFreke has kindly included it in his v1.41 image as described above, and you can get the source/patches here.


  • Multi-Touch zooming using “pinch” gesture like on the iPhone. Fully working and ready to use for daily browsing.
  • Some pages are not zoomable (even in the single-touch browser), and all pages have an upper and lower zoom limit. With multi-touch zooming, a new message is shown when you cannot zoom into or out of a given page, to give you more feedback as to why the pinch gesture isn’t working.
  • Also contains a patch by JesusFreke to support autorotate of the browser display based on phone orientation. You have to manually enable this in the Prefs.


  • When you zoom (even in the single-touch browser), the Android Browser changes the text wrap width of all major blocks of text on a page to fit the width of the screen, causing a reflow of the HTML page a second or two after zooming. With multi-touch, this causes the browser page to jump around as you are zooming — mostly noticeable on smaller pages that don’t take long to reflow, e.g. google.com “Classic”. EDIT: Note that there is an option “Auto-fit pages” in the Browser settings that you can turn off to disable reflow. This makes multitouch browsing much more responsive, because the browser is not constantly trying to reflow and doesn’t suddenly reset your view position when the reflow has completed, but of course if the text line width is greater than the screen width then you will need to scroll back and forth to read. It’s a tradeoff…
  • There is a bug in the current release of the (single-touch) Android Browser where the embedded WebView remembers its zoom setting across page views, but the BrowserActivity does not. Therefore the BrowserActivity may think it is displaying a page at a different zoom than it actually is. If you zoom all the way into a zoomable web page like say cnn.com, and then browse to a non-zoomable page like google.com “Mobile”, then the embedded WebView will still be zoomed all the way in, but the BrowserActivity will tell you you can’t zoom out on the current page. This bug seems to be present whether you use multi-touch zooming or the zoom buttons, but it is a lot easier to run into on the multi-touch browser, because zooming is a much more pleasant experience so you’re likely to do it more 🙂 Somebody should look into this bug, it’s probably not hard to fix. If you find a fix please send it to me…
  • You need to keep your fingers at least an inch or so apart for the zooming to work well, due to the “snapping” problem that was shown in my first video, and in the part of the video in this post that shows the MTVisualizer application.

A rather boring screenshot of the MTMapsDemo application -- no fun unless you're actually using it 🙂


This is a very basic implementation of multi-touch zooming on Google Maps. It basically just wraps a WebView and displays your current location on the map, which you can then scroll around and zoom into/out of. You can get the source here or an .apk (compiled binary) that you can install on a patched phone here.


  • Display maps showing your current GPS location, jump to current location, pan, multi-touch zoom.


  • No directions, no searching of the map, and nothing else either — the main reason being, Google has chosen not to release their Google Maps app as open source for reasons that are beyond me, so you’re left with having to re-invent the wheel for every map application, and you have to work with a WebView through an API that is very limited (at least from a navigation perspective).
  • There are issues with the map jumping wildly and/or scrolling locking up if you zoom out too far. This is due to inaccuracies in the inverse-Mercator projection calculations. Google doesn’t give you much of an API to work with in MapView, unfortunately, or much example code — so it’s hard to get a maps app working as well as theirs does. EDIT: it is almost impossible to browse Europe or New Zealand for this reason. I’m sure there’s a way to do this right using the API, it’s just complicated because the API is very small and limited, and converting between the different coordinate systems used (degrees, microdegrees, pixels, zoomed pixels, pixels-at-the-equator — as well as Mercator projection vs lat/long) makes this quite hard to do. Somebody please look into it, this is just a demo 🙂
  • Additionally, I think there is an issue with the view jumping back to the GPS position sometimes when you’re zooming. Shouldn’t be hard to fix but this is only a demo of what can be done, it’s not intended for use as a real replacement for Maps… Some interested party should look into this, I am unlikely to keep developing this app myself. I hope somebody will pick this up and make a full-featured Google Maps replacement…
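The coordinate conversions that make those map issues fiddly are standard spherical (Web) Mercator math. A sketch of the lat/long to normalized-map-ordinate conversions, assuming the usual spherical Mercator projection used by web maps (names invented for illustration):

```java
// Spherical (Web) Mercator conversions -- the math behind the
// inverse-Mercator calculations mentioned above. yNorm is a normalized
// map ordinate in [0,1], with 0 at the top (north) edge of the world map.
final class MercatorSketch {
    static double yToLatitude(double yNorm) {
        double n = Math.PI * (1.0 - 2.0 * yNorm);
        return Math.toDegrees(Math.atan(Math.sinh(n)));  // Gudermannian function
    }
    static double latitudeToY(double latDeg) {
        double latRad = Math.toRadians(latDeg);
        double n = Math.log(Math.tan(Math.PI / 4.0 + latRad / 2.0));
        return (1.0 - n / Math.PI) / 2.0;
    }
}
```

Note that the projection blows up near the poles (the top of the map is about 85°N), which is one reason zoomed-out panning can go haywire at extreme latitudes.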

The photo sorter demo application



This is a basic photo management application that demonstrates how to use the MultiTouchController to handle dragging and stretching multiple different objects. You can get the source here or the .apk here.


  • Allows you to move photos around in a stack, reorder them, and stretch/scale them.


  • Limited to the built-in images. A simple improvement would be to load photos from the SD card.
  • Somewhat slow because it uses Skia; it would be better to code this up as an OpenGL app.
  • May occasionally run out of memory and crash — Android apps are limited to using 16MB on the G1, so image-manipulating programs have to be very careful about how they use memory.  EDIT: Happens especially when you slide open the keyboard.

The MT Visualizer app.  Touch points are at the endpoints of one of the diagonal lines.



This is an app that allows you to visualize the multi-touch information, so you can see the limitations of the G1’s multi-touch screen (that it is a 2x1D device not a true 2D device, as I described in my previous post). You can get the source here or the .apk here. NOTE: For this particular demo, because the .apk file is very small, if installing directly on the phone you must download the .apk over wifi, not over 3G, because a bug in the browser causes the file to get truncated a couple of kb short. The app installer in that case will say that it wants to replace “Android System”. Some info is here.


  • Displays row and column of each touch point (vertical/horizontal lines), touch radius for multi-touch (big red circle), angle between touch points (with diagonal lines), midpoint between touch points (blue lines), pressure (yellow circle centered around midpoint).


  • This app shows that multi-touch on the G1 is limited in several ways — you get some weird snapping/popping effects etc. when touch points get close in one or other axis.  As a result, the G1’s screen is a much better device for multi-touch scaling than it is for multi-touch rotation or more general multi-touch input, because the distance between the two touch-points in multi-touch can be measured a lot more reliably than the actual position of the points.

Official support in Android?

There is no official statement from Google about planned multi-touch support in Android, although there is a driver in the kernel tree for an iPhone-like screen. In my opinion, it is unlikely that the G1 will ever get official support for multi-touch from Google in its lifetime, because it was never designed as a multi-touch device, and there are some limitations to multi-touch on this device. Google will also not accept upstream this specific workaround solution to hack multi-touch onto the G1, understandably so, as no multi-touch API has yet been proposed or agreed on. See this thread for more information.


Thanks to RyeBrye for his original work on multi-touch, JesusFreke for his awesome work on producing souped-up replacement images and for incorporating my changes into his latest release, zinx for helping polish the kernel patch, and DarkriftX and Disconnect for general awesomeness and help with documentation, testing and other aspects of this release.


Shameless plug: If you like this work then please feel free to use this link for amazon.com purchases.

Also if you don’t have a G1 phone yet, you can buy one using this link, or if you already have a G1, you can buy accessories here.


That’s it! This is all the work I ever expect to do on multi-touch for the G1, because I have more interesting ideas for apps that I want to spend my time working on. Watch this space. I hope other programmers will pick up multi-touch and run with it. Since this multi-touch solution is backwards and forwards compatible, there is no reason people can’t start adding additional multi-touch capabilities to apps that they release to the Market, starting today. When an official multi-touch API is available, apps can target that instead.

Have fun.

Luke Hutchison


Breaking news here. What this will mean for North Korea is uncertain.


The real story behind multitouch

(including screenshots, video and working code for functional multitouch on the G1)

Short story: I have full multitouch scaling and panning working in specially-developed apps on a stock T-Mobile G1 Android phone with a change to just one system classfile (i.e. with no modifications to the kernel whatsoever).

MultiTouch running on the G1 without kernel modification (red and green circles are drawn where touch points are detected)


Long story: read on for full details, including a video of the action, and full source code so that you can run this yourself (assuming you are a developer and understand the risks of doing this — this is NOT yet for end-users).

Shameless plug: if you like or use this multi-touch work for Android, please donate to support continued development of awesome features for Android!

For those with ADD or who don’t want to read the gory details, you can just watch the video on YouTube (it is also embedded below).


Touch screens and tinfoil hats

When the T-Mobile G1 / HTC Dream was released, it only supported single-touch rather than iPhone-style multitouch. Theories as to the lack of multitouch included hardware limitations, software support for it not being ready in the Android stack, and the threat of being devoured by Apple’s patent lawyers. Dan Morrill, a Google developer advocate for Android, made statements that the device was single-touch and the Android stack had no support yet for multitouch, but that Google would be willing to work together with handset manufacturers to develop multitouch software support when the hardware manufacturers were ready to release a multitouch handset. Eventually even one of HTC’s chiefs chimed in that the Dream was only ever designed to be a single-touch device.


Recently though, videos started surfacing on the net that showed various experiments people were performing on ListViews with two fingers that seemed to indicate the screen supported multiple touchpoints — however the results of these tests were still pretty inconclusive. Finally though, after the source of the Android stack was released, a developer, Ryan Gardner (RyeBrye), posted on his blog that he had managed to locate some lines in the kernel driver that were commented out that indicated that multitouch was indeed possible on these devices — and he hacked together a demo of two-fingered drawing that proved it.

To use RyeBrye’s solution, you have to recompile your phone’s kernel. It works by removing the comments around some debug statements (lines 132-151 of the Synaptics I2C driver, synaptics_i2c_rmi.c) that dump motion events out to a logfile. He then wrote a user interface to read the logfile and draw dots on the screen.

Google, of course, continued to remain silent on the multitouch issue, and conspiracy theories grew thicker…

Enabling multitouch on the G1, the real way

RyeBrye did a great service to the Android hacker community by demonstrating that the screen is multitouch-capable. However there are some real limitations to his approach (which he fully acknowledged), such as having to recompile your kernel and having to get at the events by parsing a logfile. Also it looks like nobody yet has picked up the ball and turned his work into a working system.

Actually, it turns out that if you read a little further down in the driver code (lines 187-200 of synaptics_i2c_rmi.c), you’ll notice that you don’t need to recompile your kernel at all to get multitouch working on the G1 — the kernel driver in fact already emits multitouch information! The driver emits ABS_X, ABS_Y and BTN_TOUCH values for position and up/down information for the first touchpoint, but also emits ABS_HAT0X, ABS_HAT0Y and BTN_2 events for the second touchpoint. Where are these events getting lost then?

I pulled apart the Android stack and scoured it for the location where these events are passed through to Dalvik via JNI. It turned out to be very difficult to pinpoint where input events were received and MotionEvent objects populated (because events are processed on a queue, the objects are recycled rather than created, and it all happens in non-SDK code — egrep wasn’t much help either). The exact point at which multitouch information is lost, though, turns out to be $ANDROID_HOME/frameworks/base/services/java/com/android/server/KeyInputQueue.java. This class is the only code running on Dalvik that ever gets to see the raw device events — and it promptly discards ABS_HAT0X, ABS_HAT0Y and BTN_2. (It doesn’t seem to do so intentionally or maliciously; it just ignores anything it doesn’t recognize, and it is not coded to recognize those event symbol types.)
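Schematically, the drop happens like this (event code values are the standard ones from linux/input.h; the real KeyInputQueue logic is of course more involved than this sketch):

```java
// Schematic of why the second touch point vanishes: the event queue only
// recognizes the first-touch-point event codes and silently ignores the
// rest. Event code values are from linux/input.h.
final class RawEventFilter {
    static final int ABS_X = 0x00, ABS_Y = 0x01;          // first touch point
    static final int ABS_HAT0X = 0x10, ABS_HAT0Y = 0x11;  // second touch point
    static final int BTN_TOUCH = 0x14a, BTN_2 = 0x101;
    static boolean recognized(int code) {
        switch (code) {
            case ABS_X: case ABS_Y: case BTN_TOUCH:
                return true;   // passed on into MotionEvents
            default:
                return false;  // ABS_HAT0X / ABS_HAT0Y / BTN_2: dropped
        }
    }
}
```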

Now we’re getting somewhere. I recompiled the whole Android stack and tested detecting these events, and sure enough, I could now detect the second touchpoint — without recompiling the kernel (but, unfortunately, after having to modify part of the Android Java stack).

Two touch points being detected, with blue bars indicating the column and row of each touch point


Implementing functional multitouch on the G1 in a backwards-compatible way

I wanted to find a way to pass multitouch events through to user applications in a way that was as minimally invasive as possible, i.e. that didn’t require a major replumbing of the whole MotionEvent system, and that was backwards compatible with single-touch applications. It turns out that there is a field in MotionEvents, “size”, that does not appear to be used currently. The value is actually mapped into MotionEvent’s size field from the ABS_TOOL_WIDTH attribute emitted by the Synaptics driver — however it seems to be ignored by the Android UI, and its value is pretty chaotic. I suspect the driver actually uses it to represent some attributes of a tool used on similar Wacom-style tablet devices.

Anyway the driver specifies that ABS_TOOL_WIDTH can be in the range [0,15] (and this is mapped to the range [0.0,1.0] when it is placed in the size field), so we have four spare bits in each motion event that are unused. I modified KeyInputQueue.java to generate either one or two motion events depending on whether or not BTN_2 was down, and then marked each event with a bit (bit 0) signifying whether the event was for the first or the second touch point. I then used two more bits to attach the two touch point up/down states to each motion event, BTN_TOUCH and BTN_2, so that individual touch states of the two buttons could be known from either event type, and then, for backwards-compatibility purposes, I set the button-down state of each generated event to the state of (BTN_TOUCH || BTN_2). This is done to keep the semantics of the button-down status of MotionEvents consistent with what the event pipeline would expect, specifically so that the up/down status doesn’t alternate between emitted events.
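A sketch of that bit-packing (class and method names invented; the key facts from above are that ABS_TOOL_WIDTH is 4 bits wide, in [0,15], and the stack maps it into the size field as width / 15.0):

```java
// Sketch of the bit-packing described above. Bit 0 says which touch point
// this MotionEvent carries; bits 1 and 2 carry the BTN_TOUCH and BTN_2
// up/down states so that either event stream knows the full touch state.
final class TouchPointCodec {
    static int encode(boolean isSecondPoint, boolean btnTouch, boolean btn2) {
        int bits = 0;
        if (isSecondPoint) bits |= 1;  // bit 0: first (0) or second (1) point
        if (btnTouch)      bits |= 2;  // bit 1: BTN_TOUCH state
        if (btn2)          bits |= 4;  // bit 2: BTN_2 state
        return bits;
    }
    // ABS_TOOL_WIDTH in [0,15] is mapped into the size field as width / 15.0
    static float toSizeField(int bits)   { return bits / 15.0f; }
    static int   fromSizeField(float sz) { return Math.round(sz * 15.0f); }
    static boolean isSecondPoint(int bits) { return (bits & 1) != 0; }
    static boolean firstDown(int bits)     { return (bits & 2) != 0; }
    static boolean secondDown(int bits)    { return (bits & 4) != 0; }
}
```

The round-trip through the float size field is lossless because only 4 bits are packed into it.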

The result is an Android stack that behaves normally for single-touch, generates events that can be separated into two streams by multi-touch-aware applications, and at worst only generates a series of events that appear to jump back and forth between two points on the screen when two fingers are touched to the screen in a single-touch application — e.g. if you are using a standard listview and hold down two fingers, the list will just jump up and down between the two fingers as you move them around.


Here is a video of a multitouch application that I wrote to exercise the modified Android stack.

The REAL reason for no multitouch on the G1 at time of release

Note that I mention in the video that the multitouch screen for some reason “was disabled at the time of release”. I do not at all believe this was an intentional curbing of the phone’s functionality — it just (1) was not in the design specs to have this feature for the first phone release, (2) would not have been ready in time (the hardware support for it is not polished, and the software support not started in the G1), and (3) was not central to the core mission of what Android was trying to achieve. Honestly, having looked through some of the ENORMOUS mass of source code in the Android stack, I don’t have any idea at all how it was all pulled together in time for release, and how the release happened with so few 1.0 problems. The Android software stack is incredibly well engineered and well integrated — it is a testament to some amazing engineering and project management that all the pieces could be developed separately and finally integrated into a single working product in such a short time.

As is probably clear from the video, there are some technical challenges to making multitouch work on this hardware. The main technical problem is that the Synaptics screen is not a true 2D multitouch device. It is a 2x1D device: it contains two sets of orthogonal wires, plus firmware for analyzing the resulting two 1D projection histograms of capacitance across the screen. This leads to a number of problems, in approximately decreasing order of severity:

  1. When there are two touch points on the screen separated diagonally, there are two peaks in each projection histogram, but the hardware has no way of knowing if this represents a forward diagonal configuration or a reverse diagonal configuration. As a result, points that are being tracked can swap over each other (hard to explain, see the video).

    An example of the touch points crossing over each other


  2. When points get too close together in one dimension, their histogram peaks merge together in that dimension, giving an undesirable “snapping” of the points to each other’s ordinates (one of the two coordinates). The radius of touch points on the screen is quite large (because the peaks in the projection histogram have to be quite well separated to be counted as separate peaks), so when fingers get close together, both points can merge into a single point, meaning your fingers can’t start really close together in a “zoom in”/”pinch-out” gesture.

    An example of "snapping" when two points get too close together horizontally or vertically (regardless of their separation in the other dimension)


  3. If the second finger is kept down and the first finger is lifted, then suddenly the second point’s location is returned in the first motion event (this may cause problems for application writers).
  4. The thresholding algorithm in the hardware is not well calibrated, so in multitouch mode the peak-detection threshold is slightly different for the two axes, and points can “lose an ordinate”, jumping across to align with the other point in one of the axes. This gives very messy, sudden motion events when a finger is placed down or raised.
  5. Several smaller problems also exist; for example, adding a second finger decreases the overall pressure measurement returned in the event, because pressure has not been correctly calibrated for multitouch.
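To make problem 1 concrete, here is a minimal sketch (not taken from the driver or the demo code; all names are hypothetical) of why the two diagonal configurations are indistinguishable, and of the nearest-neighbour guess a tracker is reduced to making:

```java
import java.util.Arrays;

public class ProjectionAmbiguity {
    // Given the peak positions from the X and Y projection histograms,
    // return BOTH point pairs consistent with those projections.
    static int[][][] candidates(int x1, int x2, int y1, int y2) {
        int[][] forward = { { x1, y1 }, { x2, y2 } };  // "forward diagonal"
        int[][] reverse = { { x1, y2 }, { x2, y1 } };  // "reverse diagonal"
        return new int[][][] { forward, reverse };
    }

    // A tracker can only guess by matching each candidate pair against the
    // previously tracked points; when fingers move quickly the guess can be
    // wrong, which is exactly when tracked points "swap".
    static int cost(int[][] pair, int[][] prev) {
        int c = 0;
        for (int i = 0; i < 2; i++) {
            c += Math.abs(pair[i][0] - prev[i][0]) + Math.abs(pair[i][1] - prev[i][1]);
        }
        return c;
    }

    static int[][] pick(int x1, int x2, int y1, int y2, int[][] prev) {
        int[][][] cands = candidates(x1, x2, y1, y2);
        return cost(cands[0], prev) <= cost(cands[1], prev) ? cands[0] : cands[1];
    }

    public static void main(String[] args) {
        // Both configurations project to identical histogram peaks:
        int[][][] c = candidates(10, 90, 20, 80);
        System.out.println(Arrays.deepToString(c[0])); // [[10, 20], [90, 80]]
        System.out.println(Arrays.deepToString(c[1])); // [[10, 80], [90, 20]]
    }
}
```

The hardware gives you only the four peak positions, so any disambiguation has to happen in software, and it is inherently a heuristic.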

These problems, especially the first two, are serious for general multitouch usage. This is almost certainly one of the biggest considerations behind the decision not to support multitouch on the G1. (There is probably also a financial reason, patent worries or otherwise. There’s always money involved in anything you don’t understand…) The biggest problem, the inability to distinguish between forward- and reverse-diagonal configurations, means that general multitouch gestures involving rotation simply won’t work in the general case. (But see the motion-estimation workarounds below.)

The good news

It turns out, though, that you don’t need rotation gestures for most of the multitouch operations people would be interested in, because we work mostly with axis-aligned documents: maps, word-processing documents, web pages… As long as your fingers are not too close together in either axis, you can get all the information and resolution you need for iPhone-worthy zooming and scrolling from the G1’s hardware events.
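This is also why the point-swapping problem does not hurt pinch-zoom: the scale factor depends only on the distance between the two points, and that distance is unchanged if the points swap identities. A minimal sketch (hypothetical names, not the demo code):

```java
public class PinchScale {
    static double dist(double x1, double y1, double x2, double y2) {
        return Math.hypot(x2 - x1, y2 - y1);
    }

    // Scale factor between two snapshots {x1, y1, x2, y2} of the touch points.
    static double scale(double[] prev, double[] cur) {
        return dist(cur[0], cur[1], cur[2], cur[3])
             / dist(prev[0], prev[1], prev[2], prev[3]);
    }

    public static void main(String[] args) {
        double[] prev = { 100, 100, 200, 200 };
        double[] cur  = { 50, 50, 250, 250 };       // fingers spread: 2x zoom
        double[] curSwapped = { 250, 250, 50, 50 }; // same fingers, IDs swapped
        System.out.println(scale(prev, cur));        // 2.0
        System.out.println(scale(prev, curSwapped)); // 2.0 (swap is harmless)
    }
}
```

The same symmetry holds for the midpoint, so axis-aligned pan-and-zoom survives the diagonal ambiguity entirely.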

Scaling a map (at least, the image of a map) -- note that the points have inadvertently swapped, but the scale factor is still chosen correctly

Additionally, the G1’s touch screen has a slight advantage for two-fingered (axis-aligned) touch gestures, such as sliding two fingers down or across the screen: if the two touch points are almost aligned in one axis, it locks them into alignment, making two-fingered gesture detection more natural (ok, that’s a stretch 🙂 )

Scaling an image, with points snapped horizontally. Scale factor is not affected too dramatically by point snapping, because the distance between snapped points and actual finger positions is fairly similar.

As demonstrated in the video, the system should work fine for zooming and panning maps and web pages.

It also turns out that the multitouch events generated by the driver are very noisy (i.e. not well tested or polished). I had to do a lot of complicated smoothing of event noise to get the system usable to this level. In addition to the loss of accuracy around axis crossings described above, quite a number of events give wildly inaccurate X and Y coordinates just before and just after a change in up/down state. There is still a little more tuning and polishing to be done, but the code is below if you want to play with it and improve it.
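As a rough illustration of the kind of cleanup involved (a simplified sketch with made-up constants, not the actual smoothing code in the demo app), one can combine a low-pass filter with rejection of one-frame spikes of the sort seen around up/down transitions:

```java
public class TouchSmoother {
    static final float ALPHA = 0.5f;    // low-pass weight given to new samples
    static final float MAX_JUMP = 150f; // reject larger one-frame jumps (px)

    float fx = Float.NaN, fy = Float.NaN;

    // Feed a raw sample; returns the filtered position {x, y}.
    float[] feed(float x, float y) {
        if (Float.isNaN(fx)) {
            // First sample after a down event: accept as-is.
            fx = x; fy = y;
        } else if (Math.hypot(x - fx, y - fy) > MAX_JUMP) {
            // Wildly inaccurate sample (common just before/after up/down
            // transitions on this hardware): hold the previous position.
        } else {
            // Ordinary sample: exponential smoothing.
            fx += ALPHA * (x - fx);
            fy += ALPHA * (y - fy);
        }
        return new float[] { fx, fy };
    }
}
```

A real implementation also has to reset the filter state on up events and handle the two tracked points separately; this just shows the basic spike-reject-then-smooth shape.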

What can be done to fix or work around the remaining problems

The system could be made more natural to use by building motion estimation (inertia and damping) into the vicinity of the discontinuities where touch points cross over each other’s axes, so that if the user is in fact performing a rotation gesture by moving strongly toward the axis-crossing point, events will continue to be generated that smoothly cross that point. Of course, there is still the potential for error here if the user stops or reverses direction.
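One possible shape for such an estimator (entirely hypothetical, and not in the posted code) blends the raw reading with a velocity-extrapolated prediction, trusting the prediction more as the point approaches the crossing:

```java
public class CrossingEstimator {
    // 1D for brevity; the same idea applies independently per axis.
    float x, vx;
    static final float NEAR = 20f; // "near the crossing" threshold (px)

    void init(float x0) { x = x0; vx = 0; }

    // raw: this point's new reading; otherX: the other point's ordinate
    // on the same axis (the crossing location); dt: time step.
    float feed(float raw, float otherX, float dt) {
        float predicted = x + vx * dt;
        // The closer this point is to the other point's ordinate (i.e. to
        // the ambiguous crossing), the more we trust the prediction.
        float d = Math.abs(raw - otherX);
        float w = Math.min(1f, d / NEAR);   // 0 at the crossing, 1 far away
        float newX = w * raw + (1 - w) * predicted;
        vx = (newX - x) / dt;
        x = newX;
        return newX;
    }
}
```

This keeps a steady rotation gesture moving smoothly through the crossing, at the cost of briefly ignoring real input if the user suddenly stops or reverses there, which is exactly the failure mode noted above.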

Getting and running the code

So I mentioned that you wouldn’t have to recompile your kernel… but you do still have to recompile one system class of the Android Java stack, or all you can do with the demo code is operate one touch point as normal (i.e. just drag, not stretch).

Unfortunately, the version of the Android stack that made it onto the G1 was derived from a snapshot of the code taken quite a while before Android 1.0 was released, so you can’t just patch the one class, recompile that class’ .jar file, and re-install a single .jar on your phone: that .jar file, built from the publicly available Android 1.0 source (or, worse, Cupcake/1.1), is unlikely to work with the rest of the .jar files on your phone. So for now you need to build the entire 1.0 stack with the patch and then reflash your entire phone.

Note the following:

  • The Android 1.0 source in git builds a system that is a little bit broken in a lot of ways. Expect things not to work, and as a result expect multitouch to not be available on your primary phone until someone produces a more polished release from source that you can use.
  • Cupcake is still not ready; it is very broken right now. Use 1.0, don’t use Cupcake.
  • If you try this, you take full responsibility for anything that goes wrong, and if it breaks you get to keep all the pieces. You agree to not hold me responsible in any way if you lose important data or brick your phone, or if anything else goes wrong.
  • This is not yet ready for mainstream. If you are not a developer then wait until someone develops a working system that you can use easily.

Steps to follow:

  1. Get the Android source here
  2. Get my modified KeyInputQueue.java and overwrite the original in the Android source at $ANDROID_HOME/frameworks/base/services/java/com/android/server/KeyInputQueue.java .
  3. Get root on your phone, build the whole patched Android stack, and flash it onto your phone by following these instructions (except that you should use the 1.0 branch in git, not the cupcake branch). You could consider using JesusFreke’s RC30 v1.3.1 instead of v1.2 that is specified in those instructions. NOTE: all of these steps are highly dangerous to your phone, you must know what you are doing before you attempt this, and you agree to take full responsibility if anything breaks.
  4. Download and run my demo application which receives the patched events and splits them into separate events for each touch point. (This is the application that is demoed in the video.)

Using the demo application

  • Roll the trackball left and right to switch between the two views
  • Press the trackball down (center-press) on either screen to toggle extra debug info. (Debug info starts “on” on the first screen and “off” on the second.)
  • All other interaction is performed by dragging one or two fingers on the screen.

Future development

There is considerable work that could be done to polish this and tweak it for optimal usage. A lot of the demo code (event-noise smoothing, etc.) could be moved into the Android stack, and motion estimation could be added to make things smoother. There are still occasional glitches when you lift one finger off the screen after a multitouch operation, as well as when one finger hits the edge of the screen (due to some edge logic in the low-level driver, I think).

Getting this patch upstream is probably unlikely, because ultimately this is a hack, especially the hijacking of the MotionEvent size field. The actual impact on single-touch applications is very low, though: just some weirdness/jumping when you have two fingers on the screen. Note that the G1’s default software stack has its own weirdness here (as the very first grainy “we think there’s multitouch on the G1” YouTube videos showed), due to the hardware event noise when you lift one finger from a multitouch event.

I suggest someone write a .odex editor tool that can selectively excise one class from a .odex file and replace it with another Dalvik-compiled class. Then “all” you would need to do to get multitouch on your phone would be to get root and then patch your system; everything else should keep working as normal.

Ideally someone would then graft this patched .odex file into JesusFreke’s RC30 image, so that all you had to do was reflash your phone and you’d have a phone that is fully working, but with multitouch support too. (At the moment it’s either-or…)

I also want to put out a challenge for someone to build a multitouch frontend for Google Maps and WebView. In the demo, I just scale static images of a map and a webpage.

You can also use my code if you need a testbed to start developing your own multitouch software, so that you’re ready for the day that multitouch is officially supported by Google.

I am unlikely to do any more with this code myself; I just had to show it could be done 🙂

Final word

Please don’t sue me, Apple.

That’s it! Have fun.

Please discuss among yourselves in this Google Groups thread.

— Luke Hutchison, 2009-01-10
(If you want to know where to send the check 😉 , you can email me at the domain name “mit dot edu” preceded by: my first name, then dot, then the first five letters of my last name, then “at”. Please don’t ask me how to get this installed and working on your phone though, it’s not ready for end-users and I cannot respond to individual queries about installation.)


UPDATED 2009-01-12

An even better solution, and a way to get multitouch on your phone tomorrow (or whenever I get the time to do this…)

I just realized that I can patch the event system to work in a way that even Google would probably be happy with, i.e. in a way that would probably allow this patch to make it upstream. All I need to do is send *one* MotionEvent for multitouch events, with the X and Y coordinates set to the midpoint between the touch points, and the size field of the event set to the distance between the touch points. The size would be zero for single-touch events. This would allow the application writer to simply detect event.getSize() > 0.0f and initiate a scale operation. It also dramatically simplifies application code, because you don’t have to deal with two events, and it cuts down on the number of events sent through the event pipeline.
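Under this encoding, application code would reduce to something like the following sketch (the class and handler names here are placeholders rather than the real android.view.MotionEvent API; only the x/y-is-midpoint, size-is-distance convention comes from the proposal above):

```java
public class MidpointSizeDecoder {
    float startDist = 0f;

    // Handle one (x, y, size) event under the proposed encoding.
    // Returns the zoom factor to apply; 1.0 means plain single-touch drag.
    float onMotion(float x, float y, float size) {
        if (size <= 0f) {
            // size == 0: single touch, so any pinch in progress has ended.
            startDist = 0f;
            return 1f;
        }
        if (startDist == 0f) {
            // Pinch just started: remember the initial inter-point distance.
            startDist = size;
            return 1f;
        }
        // Spreading the fingers gives a factor > 1, pinching gives < 1.
        return size / startDist;
    }
}
```

The (x, y) midpoint would similarly drive panning and pick the zoom anchor point, and nothing here requires the application to reason about two separate pointers at all.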

This solution only allows for scaling, not rotation (I would need an additional event field “angle” to pass along rotation information), but it is sufficient to get multitouch working in all applications *today*, in a way that doesn’t break anything at all — and rotation won’t work well on this touch screen anyway, as the video points out.

Since it is almost impossible to build a working Android Java stack right now (the source snapshot that made it onto the G1 was never released as a unit to git; it was taken from the state of the source several months before the release of the 1.0 source and then polished extensively), it turns out that the simplest place to implement this is actually in the C kernel driver. (It is easy to get unmodified Android running on a modified kernel.) Then you would only have to put together a flashable system image with a custom kernel and an *uncustomized* Java stack taken directly from the G1 (i.e. RC30), and the chance of things breaking would be much lower. This would also be a lot easier to redistribute, so that we can get lots of people out there with working multitouch on their phones and start pushing multitouch apps out to the market, without waiting for Google to upstream a patch (or for T-Mobile to pick up Google’s patch and OTA it, which may take until the next ice age).

I will cook up a kernel patch soon and see if JesusFreke will do a system image release with my patch.


My semester-final Chinese essay, followed by the Google machine-translation (which ends up hilarious — Computer generated Chingrish!)




住在现代的人陷入生活的疯狂竞争。对大家来说,找到幸福之秘方是追求的目标 ,但是他们同时又觉得,”我一爬过了下一座山就会得到幸福”。他们不觉醒的是下一座山不是小山而是参天的。到了仿佛是顶的地方的时候他们才发现永无止境,顶还远。



问题就是我们看幸福的角度。幸福不在于终点而在于过程。对大多部分的人来说,成就不是目的,幸福是目的。成功也不是得到幸福之手段,不快乐的成功的人多的是。说实在,“生活很像坐火车的过程——有耽搁、绕道、烟雾、灰尘、煤屑、颠簸等,只偶尔有漂亮的山水及享受快速的移动。重要的是感谢上帝给你坐。”(-Gordon B. Hinckley)



那么,这个问题的解决到底在哪儿?“不要问世界需要什么。问一下什么令你活跃起来。然后去做这件事清,因为世界需要活跃起来了的人。”(Harold Whitman)我们的生活的时间是一种金钱,而且我们做的所有都花生活金钱,有时花巨额。生活必须有所取舍,怎么花我们有限的事件给我们带来挑战。恐怕很多人的经费习惯在于浪费在不必要的事情。




The Pursuit of Happiness

Although the current era is always the most peaceful and prosperous times, but according to the statistics of modern people with no recent times more than unhappy. What is the reason?

People living in modern life into the frantic competition. For all of us to find the secret of happiness is the goal pursued, but they also feel that “I climbed a mountain the next will be happy.” They do not awaken the next hill but a mountain is not the towering. To places like the top of the time they found an endless, far from the top.

We are full with the “achievements will be happy to be” over the imagination of our lives. Therefore, we effort to striving for success is always our goal to bring to fulfillment busy exhausted. But to find it difficult to achieve only our tired. How, if we in the pursuit of happiness and at the same time there is no spirit or did not have time to enjoy life’s good, not very contradictory Ma?

In that case, bring happiness in life, what is? If happiness does not lie in the pursuit of success, then why are we strive? Why do not we live in. We find happiness? Why do we always insist on the pursuit of happiness does not give us any thing?

Problem is that we look at the point of view of happiness. Happiness is not the end but in the process. For the most part of the people, not an end in success, happiness is the goal. Nor is it to be a successful means of happiness, the success of unhappy people who are numerous. To tell the truth, “much like to ride the train of life of the process – there are delays, Bypass, smoke, dust, cinder, bumps and so on, only occasionally have a beautiful landscape and enjoy the fastest mobile. Is important to thank God for you to sit.” (-Gordon B. Hinckley)

Recurrent dreams can not be rooted in childhood, we would like to become a hero on the desire. We can not grow up to overcome this desire, would like to receive the applause of others. Aspiring people often feel a need to do important things-money, otherwise they will not satisfied with worry. But the important thing to do after they found their happiness and the people do not align compliment. Repeated pursuit compliment to meet their own sense of loss will only make people若失, is caught in a dilemma – but is only repeating the same mistakes.

The results, at present, many people can not悠游freely enjoy their life. Just imagine, ranking the leading position of university students a few students willing to live in a small town in rural areas, have been unknown to ancient life? Most people would feel to miss this opportunity.

In that case, the solution of this problem where children? “Ask not what the world needs. Ask what makes you become active again. And then do the clearance, because the world needs an active person.” (Harold Whitman) the time of our lives is a kind of money, and we do all have spent money to live, and sometimes spend huge. Life will be trade-offs, how to spend our limited events to our challenges. I am afraid many people accustomed to the funding is wasted on unnecessary things.

The direction of our lives do not determine to a great decision, and the decision to the daily life of tiny identified. In addition, the secret of happiness is human relationships. In other words, do not attach importance to others, did not express the gentle love of family, how can we talk about the achievements in the world or compliment it?

In short, we need to reverse the world’s told us it would bring happiness – are very wrong. To this end, the need to decide how to spend money to live a significant juncture, we should consider buying the most valuable pearl of life – the development and love of their loved ones.
