somewhere to talk about random ideas and projects like everyone else

stuff

April 2010 Archive

Ajax Animator iPad Support 11 April 2010

Today I went to the magical Apple Store and tried out the iPad for the first time. I really have to say that it’s quite magical, though it doesn’t fulfill the criterion for Arthur C. Clarke’s Third Law despite what Jonathan Ive says. I haven’t really tried any other large-area multitouch interface (sadly), but I would expect the experience to be somewhat similar if not identical. Keynote and Numbers were pretty neat (I suck at typing on the iPad in any orientation, so I don’t like Pages). That’s enough to show that the iPad is not just a content consumption device, as the iPod and iPhone primarily are, but also a content creation one.

Anyway, in a few minutes I just swapped the mousedown, mousemove, and mouseup events with touchstart, touchmove, and touchend respectively in the core of VectorEditor, and added a MobileSafari detection script (var mobilesafari = /AppleWebKit.*Mobile/.test(navigator.userAgent);). In a quite analogous “magical” way, VectorEditor now works on the iPhone/iPod Touch and, theoretically, the iPad. Just dragging the VectorEditor files over to the Ajax Animator folder and recompiling should bring iPad support to Ajax Animator with virtually no work.
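
For the curious, the idea is roughly this. It’s a sketch of the approach rather than the actual VectorEditor source, and the canvas element and beginDrag handler are just placeholders:

    var mobilesafari = /AppleWebKit.*Mobile/.test(navigator.userAgent);

    // use touch events on MobileSafari, mouse events everywhere else
    var events = mobilesafari ?
        {start: "touchstart", move: "touchmove", end: "touchend"} :
        {start: "mousedown", move: "mousemove", end: "mouseup"};

    function beginDrag(x, y){ /* placeholder: start a drag at (x, y) */ }

    var canvas = document.getElementById("canvas"); // placeholder element
    canvas.addEventListener(events.start, function(e){
        // touch events keep their coordinates in e.touches, not on the event itself
        var point = mobilesafari ? e.touches[0] : e;
        beginDrag(point.pageX, point.pageY);
        e.preventDefault(); // keep MobileSafari from scrolling/zooming the page
    }, false);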

I haven’t tested it yet. I’m downloading XCode 3.2.2 right now, so hopefully I can test it soon. Stupid how it’s, what, 2.31 gigabytes?!

And possibly, I could use PhoneGap to hack together an App Store app which does the same thing (and maybe charge for it, which might be a bit cruel since this application is open source and works equivalently online - but I guess that’s okay for those people who don’t read my blog >:) ). Maybe I’d get enough to buy an iPad :P

Anyway, though I’m pretty late to this and my opinion probably doesn’t matter at all, here’s a mini iPad review: it’s really really cool, feels sort of heavy, is really expensive, and is hard to type on in any orientation. (Interestingly, the on-screen keyboard has that little line on the F and J keys, which feels useless: I always thought the point of that was so you can tactilely, or haptically, or tactically, or whatever the right word is, find the home row, but since there’s no physical dimension to an iPad, it just strikes me as weird and leaves me wanting a tactile keyboard.) Otherwise, browsing really really feels great. The only thing I miss is the Macbook Pro style 3-finger forward/backward gestures (@AAPL plz add this before iPad 2.0, and also, get iPhoneOS 4.0 to work on my iPhone 2g, or at least @DevTeam plz hack 4.0 for the 2g!).

Oh, and for those lucky enough to have a magical iPad, the URL is http://antimatter15.com/ajaxanimator/ipad/ at least until there’s enough testing to make sure that I didn’t screw up everything with my MobileSafari hacks.


Idea: Lego Mindstorms IDE for iPad 08 April 2010

I don’t have an iPad, nor is it #1 on my wish list (by “iPad” I mostly mean any tablet platform, but since none of the other ones are really recognizable, I’m jumping on the 4-letter Apple product bandwagon). But I am fascinated by touchscreens.

I started programming when I was 7, when I got my first Lego Mindstorms RIS/RCX 2.0 kit (and I loved the 13+ sticker on the box back then :P). So I’ve always had a fondness for the platform; it’s really great for getting kids into robotics and engineering. Kudos to Lego.

Recently, I’ve played around with the current rendition of the Mindstorms platform, the NXT. It’s an evolutionary advancement that maintains the original intuitiveness of the system while still catering to those who don’t really grow out of the original.

The interface is a very kid-friendly drag-and-drop block layout. I actually sort of like it, though it’s not something you could easily build a desktop application in. It’s very procedural, and that’s well suited for telling a car to explode and magically arrange red and blue balls into designated corners.

But really, where drag and drop really shines, the place where it really is meant to be, is on a multitouch tablet. On a large multitouch surface, coding with simple finger gestures and dragging just makes sense. Lego’s own Labview-based interface, called NXT-G, has large icons and is built entirely around dragging and dropping. It’s something that just feels right on a touchscreen.

The gestures would need to be tailored to the platform. I propose that two fingers, like on a Macbook, should pan around the canvas of the code. Blocks get dragged from a list on the side onto an execution path. On a block already on the canvas, touching and dragging does the logical thing: it moves the block. Touching a block without dragging makes a pie-menu-type display ooze out from the block: a bunch of output “pipes” which another finger can drag and link onto other blocks, which in turn display another pie menu (though showing inputs rather than outputs), where letting go creates the connection.
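
In touch-event terms, telling these gestures apart mostly comes down to counting fingers and checking whether they move before they lift. Here’s a rough, purely hypothetical sketch (none of these names come from NXT-G or any real code):

    var mode = null;

    function hitBlock(touch){ /* hypothetical hit test: which block is under this finger? */ }
    function showPieMenu(block){ /* hypothetical pie menu that oozes out of a block */ }

    var surface = document.getElementById("editor"); // placeholder canvas element
    surface.addEventListener("touchstart", function(e){
        if(e.touches.length >= 2){
            mode = "pan";             // two fingers: pan around the code canvas
        }else if(hitBlock(e.touches[0])){
            mode = "maybe-menu";      // one finger on a block: pie menu, unless it moves
        }else{
            mode = "drag-from-list";  // one finger on the side list: drag out a new block
        }
        e.preventDefault();
    }, false);

    surface.addEventListener("touchmove", function(e){
        if(mode == "maybe-menu") mode = "move-block"; // the finger moved, so reposition the block
        // ...pan the canvas, move the block, or drag the new block here, depending on mode
    }, false);

    surface.addEventListener("touchend", function(e){
        if(mode == "maybe-menu") showPieMenu(hitBlock(e.changedTouches[0])); // a plain tap: show the menu
        mode = null;
    }, false);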

Implementation-wise, one could try porting NBC/NXC (which is written in Pascal and already has makefiles for WinCE/ARM; FreePascal does seem to be able to compile for the iPhone/iPod Touch, and the iPad should be a virtually equivalent target). The interface itself could probably be built with SVG and/or <canvas> and loaded in a UIWebView or through the PhoneGap platform. It would then convert the graphical representation into NXC code, compile it, and use the iPad’s built-in Bluetooth 2.1 + EDR support to send it to the Lego NXT brick and do magic.
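
As a toy example of that last translation step, the block graph could be flattened into NXC source as plain text before it ever reaches the compiler. The block format and the blocksToNXC function below are entirely made up, but OnFwd, Wait, and Off are real NXC calls:

    // each block is a plain object, e.g. {type: "motor", port: "OUT_AB", power: 75}
    function blocksToNXC(blocks){
        var body = blocks.map(function(b){
            switch(b.type){
                case "motor": return "  OnFwd(" + b.port + ", " + b.power + ");";
                case "wait":  return "  Wait(" + b.ms + ");";
                case "stop":  return "  Off(" + b.port + ");";
            }
        }).join("\n");
        return "task main(){\n" + body + "\n}\n";
    }

    // drive forward for a second, then stop
    var source = blocksToNXC([
        {type: "motor", port: "OUT_AB", power: 75},
        {type: "wait",  ms: 1000},
        {type: "stop",  port: "OUT_AB"}
    ]);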