JSON

What I Did on My Winter Holiday

This winter break (or, as we called it when I was still in school, Christmas vacation), I worked on a little web application. The application was originally going to be for an automotive-related blog that I write on, but once I started working on it I realized that it would be better if it wasn't restricted to just automotive information.

The application is called scovry, and it is a web discovery application. You may have noticed that we at Ajaxonomy love spy applications (in Ajaxonomy Labs we wrote delicious Spy and TubeSpy). scovry is a blend of these spy applications that also gathers information from many other social media sites and social networks. Beyond this, the site adds a social element by allowing comments and easy sharing of items found on the site.

scovry

On the programming side, in order to load all of the data and keep the server happy I used a lot of caching. The caching code was based on the Easy Server Side Caching in PHP article that I wrote a while back, with some small changes, the biggest of which was using readfile() instead of include(). I even modified the caching script to cache the images loaded from the thumbnail service, which makes loading much faster (not to mention reduces the number of image requests).

You can check out scovry here.

JSON 3D - The Time is Near!

3D on the web is getting very close to being a reality. With the release of Google Chrome 9 and Firefox 4, which support WebGL (a JavaScript 3D API based on OpenGL ES) coupled with HTML5, 3D could soon be in use on many websites.

With 3D likely to become mainstream on the web within the next few years, I think now is the time to build modeling tools around a small, easy-to-handle format that can be used directly from JavaScript.

You may remember my post from a while back about JSON 3D. I feel that JSON would be the best format for handling 3D models on the web. It would allow models to be loaded as needed on a page with quick Ajax calls, and the format is about as small as you can get for holding this type of data. Additionally, JSON already has a huge footprint in Ajax development and has virtually replaced XML for most Ajax calls.
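
As a rough sketch of how this might work (the model URL and the renderModel function below are hypothetical, and JSON.parse assumes a native or library-provided parser), a page could pull in a model on demand with a simple Ajax call:

var xhr = new XMLHttpRequest();
xhr.open("GET", "/models/cube.json", true); // illustrative URL for a JSON 3D model
xhr.onreadystatechange = function(){
  if(xhr.readyState == 4 && xhr.status == 200){
    var model = JSON.parse(xhr.responseText); // e.g. {"vrt":[...], "fac":[...], ...}
    renderModel(model); // hypothetical renderer (Canvas or WebGL based)
  }
};
xhr.send(null);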

You can read my original post on JSON 3D here. My hope is that some of you will find this useful and will begin to write the tools that will be used to have 3D available all over the web.

Also, if you are looking to have some fun, you can view a bunch of Chrome WebGL 3D demos here.

JSON Beats XML, or Ajaj vs Ajax

Should the term Ajax be changed to Ajaj? Ajax stands for Asynchronous JavaScript and XML, but with the web increasingly favoring JSON (especially in rich Ajax applications), should it now be called Ajaj, for Asynchronous JavaScript and JSON?

The reason that I bring this up is that James Clark, one of the major contributors to XML, is now saying that JSON is the way of, at least, the "cool" web. The post where James discusses this is called XML vs the Web.

Below is an excerpt from his post.

If other formats start to supplant XML, and they support these goals better than XML, I will be happy rather than worried.

From this perspective, my reaction to JSON is a combination of "Yay" and "Sigh".

It's "Yay", because for important use cases JSON is dramatically better than XML. In particular, JSON shines as a programming language-independent representation of typical programming language data structures. This is an incredibly important use case and it would be hard to overstate how appallingly bad XML is for this. The fundamental problem is the mismatch between programming language data structures and the XML element/attribute data model of elements. This leaves the developer with three choices, all unappetising:

* live with an inconvenient element/attribute representation of the data;
* descend into XML Schema hell in the company of your favourite data binding tool;
* write reams of code to convert the XML into a convenient data structure.

By contrast with JSON, especially with a dynamic programming language, you can get a reasonable in-memory representation just by calling a library function.

Norman argues that XML wasn't designed for this sort of thing. I don't think the history is quite as simple as that. There were many different individuals and organisations involved with XML 1.0, and they didn't all have the same vision for XML. The organisation that was perhaps most influential in terms of getting initial mainstream acceptance of XML was Microsoft, and Microsoft was certainly pushing XML as a representation for exactly this kind of data. Consider SOAP and XML Schema; a lot of the hype about XML and a lot of the specs built on top of XML for many years were focused on using XML for exactly this sort of thing.

You can read the full post here.
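
As a small illustration of Clark's point about getting an in-memory representation just by calling a library function, here is a minimal sketch (assuming a native or library-provided JSON parser):

// One call turns the wire format into ordinary JavaScript objects and arrays
var user = JSON.parse('{"name":"James","languages":["XML","JSON"]}');
user.name;         // "James"
user.languages[1]; // "JSON"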

While I don't think that XML will ever totally go away (nor should it), I do think that JSON makes more sense to use in most Ajax applications (with the biggest exception being when you just want to display the data). So go out there and start using Ajaj!

Putting the Google Base API to Good Use - Part 1

A while back I wrote a post about creating a product search using the Google Base API. Well, I've made a few examples for another blog that I've been writing on called The Porsche Guy's.

Since the blog focuses on Porsche-related news and projects (I happen to have an older Porsche 944, so I enjoy talking about the cars), I created two Porsche-related search tools. The first is a Porsche Parts Finder that makes it easy to find the best prices on the web for Porsche parts (you can go to the Porsche Parts Finder here). The second is a Porsche Finder, so if you are looking for a good deal on a Porsche, this tool will find the best deals on the web (you can see the Porsche Finder here).

My next post will go into detail about exactly how I created these search tools, including code snippets, so look for part 2 of this post. Until then, check out my first post about creating a Google Product Search based application here.

How to Make a Search Based on Google's Product Search

You may have seen the Google Product Search and thought that it would be useful to include a customized version of the search in a website or application. Unfortunately, you can't just create a custom search engine based on the Product Search using Google's custom search creator.

So, how would you incorporate the Product Search into an application? The answer is to use the Google Base API. The API lets you request a feed, and if you use the [item type:products] option the feed returns the Product Search data.

The API can return the feed in Atom, RSS, or JSON format. Google has also made things easy by providing a feed URL builder (you can access the builder here).
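
To give a rough idea, here is a sketch of consuming the JSON feed. The feed URL below is only illustrative (the real one should come from Google's feed URL builder mentioned above), the same-domain proxy is an assumption to get around the browser's same-origin policy, and the "results" element is hypothetical:

// Illustrative feed URL - build the real one with Google's feed URL builder.
// The [item type:products] restriction goes in the bq query parameter.
var feedUrl = "http://www.google.com/base/feeds/snippets" +
              "?bq=" + encodeURIComponent("[item type:products] porsche 944 brake pads") +
              "&alt=json&max-results=10";

// Assumes a simple same-domain proxy, since cross-domain XMLHttpRequests are blocked.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/proxy?url=" + encodeURIComponent(feedUrl), true);
xhr.onreadystatechange = function(){
  if(xhr.readyState == 4 && xhr.status == 200){
    var feed = JSON.parse(xhr.responseText).feed;
    var titles = [];
    for(var i = 0; i < feed.entry.length; i++){
      // GData-style JSON wraps text values in a "$t" property
      titles.push(feed.entry[i].title.$t);
    }
    // hypothetical element for displaying the results
    document.getElementById("results").innerHTML = titles.join("<br>");
  }
};
xhr.send(null);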

So, if you want to make an application using the Products Search now you can.

ECMAScript 3.1 Final Draft Emerges

Also known as ECMAScript 5th Edition, the new JavaScript standard has entered final draft stage. Among the goodies: a formal getter and setter syntax for object properties, language reflection features, support for the JSON data format, additional Array methods, and a strict mode that improves error checking.

Function.bind

Function.prototype.bind(self, args...). A bind function wraps a function in a closure that stores a reference to the this context (and optionally to leading arguments) from the surrounding scope. It is somewhat equivalent to the following simplified version, which handles only the context:

Function.prototype.bind = function(context) {
  var fun = this; // keep a reference to the original function
  return function(){
    // call the original function with the bound context
    return fun.apply(context, arguments);
  };
};

Applications include partial application of arguments to a function and currying. Though you can custom-roll one today, a native bind function in 3.1 should outperform any equivalent user-defined function.
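
As a quick sketch of the partial-application use case (assuming a native, spec-style bind that also accepts leading arguments):

function log(level, message){
  // simple helper used only for this example
  console.log("[" + level + "] " + message);
}

// Bind the context (null here) and pre-fill the first argument
var warn = log.bind(null, "WARN");

warn("disk space is low"); // => [WARN] disk space is low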

Array

The additional Array methods in ECMAScript 3.1 are identical to methods introduced in JavaScript 1.6-1.8, but were never present in any official ECMAScript specification. They are currently implemented in Firefox 3.x. Of course, having them in ECMAScript 3.1 means that now you will be able to actually use them (provided, of course, that all browsers implement the standard...). These methods are: indexOf, lastIndexOf, filter, forEach, every, map, some, reduce, and reduceRight. There's a good description of each method here.
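
For example, a few of these methods in action (a minimal sketch):

var prices = [19.99, 5.00, 42.50, 3.25];

// filter: keep only the prices under 20
var cheap = prices.filter(function(p){ return p < 20; });          // [19.99, 5, 3.25]

// map: apply a 10% discount to every price
var discounted = prices.map(function(p){ return p * 0.9; });

// reduce: total up the original prices
var total = prices.reduce(function(sum, p){ return sum + p; }, 0); // 70.74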

ECMAScript 3.1 is the less ambitious replacement for what was to be JavaScript 2/ECMAScript 4, a plan scuttled when some members of ECMA balked at the large additions to the language; the larger effort now continues under the name ECMAScript Harmony.

The new specification is available here.

JSON - 3D Proof of Concept

A few weeks ago I wrote a post on the concept of using JSON for creating 3D models (you can read the first JSON 3D post here). I have now created a proof of concept based on MooCanvas, which allows for IE support. The proof of concept is also based on the 3D cube demo for MooCanvas, with some modifications.

For this proof of concept, the important function is Load3D, which loads and translates the JSON 3D object.

function Load3D(obj, translateX, translateY, translateZ){
    // Create a new shape for this object; Shape, Point, Polygon, scene and
    // ObjectCounter come from the MooCanvas 3D cube demo this is based on.
    scene.shapes[obj.type + ObjectCounter] = new Shape();
    var p = scene.shapes[obj.type + ObjectCounter].points; // for convenience

    // Translate each vertex and store it as a Point
    for(var a = 0; a < obj.vrt.length; a++){
        p[a] = new Point(obj.vrt[a][0] + translateX, obj.vrt[a][1] + translateY, obj.vrt[a][2] + translateZ);
    }

    // Create the shape's polygons from the points
    for(var a = 0; a < obj.fac.length; a++){
        scene.shapes[obj.type + ObjectCounter].polygons.push(new Polygon(
            [ p[obj.fac[a][0]], p[obj.fac[a][1]], p[obj.fac[a][2]], p[obj.fac[a][3]] ],
            new Point(obj.nrm[a][0], obj.nrm[a][1], obj.nrm[a][2]),
            true /* double-sided */,
            Polygon.SOLID,
            [obj.mat[a][0], obj.mat[a][1], obj.mat[a][2]]
        ));
    }
    ObjectCounter += 1;
}

The JSON 3D object is then passed into the Load3D function. In this example I have embedded the JSON 3D object in the HTML, but in the real world you would load it using Ajax (or a script tag).

var ThreeDobj = {
    "vrt":[[-10,-10,-10],[10,-10,-10],[10,10,-10],[-10,10,-10],[-10,-10,10],[10,-10,10],[10,10,10],[-10,10,10]],
    "fac":[[0,1,2,3],[4,5,6,7],[2,3,7,6],[2,3,7,6],[0,4,7,3],[1,5,6,2]],
    "nrm":[[0,0,-1],[0,0,1],[0,1,0],[0,1,0],[-1,0,0],[1,0,0]],
    "mat":[[70,70,70],[80,80,80],[80,80,80],[75,75,75],[70,70,70],[70,70,70]],
    "type":"cube"
};
Load3D(ThreeDobj, 0, 0, 0);

You can see the proof of concept here.

The proof of concept has been tested on Google Chrome, Firefox 3, and IE 7. It may work on other browsers (it should work on Safari and Opera), but it has not been tested on them.

JSON - 3D

Recently I have been looking at 3D as it pertains to the web, controlled through JavaScript without the aid of plug-ins (this is a topic that interests me, since I created a lot of 3D applications early in my programming days). While Canvas is currently good for 2D rendering (on most browsers, and on IE with a little help from Google), we are still a ways off from cross-browser Canvas 3D support. I have found a few 3D engines written with Canvas, but they all seem to bomb on IE (even with Google's IE Canvas script).

Even if a good 3D solution were available for the web, we would still have the issue of loading all the models for a scene. This got me thinking about how the loading of scene information could be made to work on the web.

The concept that I have come up with is fairly simple. You would load a map of the scene (this map could be stored in JSON and could be made to work as a BSP [Binary Space Partitioning] tree), and on the map you would have various checkpoints. Each checkpoint would load the needed models using JSON.

Below is an example of what the JSON for a cube may look like.

{"obj":[{"vrt":[[-5,-5,5],[5,-5,5],[-5,5,5],[5,5,5],[-5,-5,-5],[5,-5,-5],[-5,5,-5],[5,5,-5]],"fac":[[0,2,3,1],[3,1,0,1],[4,5,7,0],[7,6,4,0],[0,1,5,4],[5,4,0,4],[1,3,7,3],[7,5,1,3],[3,2,6,5],[6,7,3,5],[2,0,4,2],[4,6,2,2]],"nrm":[[0,0,-1],[0,0,-1],[0,-0,1],[-0,0,1],[0,-1,0],[0,-1,0],[1,0,-0],[1,-0,0],[0,1,0],[0,1,0],[-1,0,0],[-1,-0,-0]]}],"mat":[{"r":150,"g":225,"b":219},{"r":150,"g":162,"b":223}]

While this is just a concept, and we are still waiting on the technology to make it possible, it is interesting to think about how we may be able to use JSON - 3D in the near future.

You can see a demo of Canvas 3D by nihilogic.dk using JSON for models here (this works on Firefox, but may not work on other browsers).

Accessing JSON Web Services with the Google Web Toolkit

Over at GWT Site they have written a good post about using the Google Web Toolkit with JSON web services. Since JSON is fast becoming a standard for cross-domain web services, and GWT is a heavily used development tool, this is a useful post.

Below is an excerpt from the post.

The main difficulty when trying to talk to some web service on another server is getting past your web browser’s Same-Origin Policy. This basically says that you may only make calls to the same domain as the page you are on. This is good for security reasons, but inconvenient for you as a developer as it eliminates the use of GWT’s HTTP library functions to achieve what we want to do. One way to get around this is to call a web service through a javascript <script> tag which bypasses this problem. In his book, Google Web Toolkit Applications, Ryan Dewsbury actually explains this technique in more detail and provides a class called JSONRequest which handles all the hard work for us. JSON is one of the more popular data formats, so most web services support it. Lets leverage Ryan’s code and take a quick look at how it works.

public class JSONRequest {
  public static void get(String url, JSONRequestHandler handler) {
    String callbackName = "JSONCallback"+handler.hashCode();
    get( url+callbackName, callbackName, handler );
  }	
  public static void get(String url, String callbackName, JSONRequestHandler handler ) {
    createCallbackFunction( handler, callbackName );
    addScript(url);
  }
  public static native void addScript(String url) /*-{
    var scr = document.createElement("script");
    scr.setAttribute("language", "JavaScript");
    scr.setAttribute("src", url);
    document.getElementsByTagName("body")[0].appendChild(scr);
  }-*/;
  private native static void createCallbackFunction( JSONRequestHandler obj, String callbackName)/*-{
    tmpcallback = function(j) {
      obj.@com.gwtsite.client.util.JSONRequestHandler::onRequestComplete(Lcom/google/gwt/core/client/JavaScriptObject;)(j);
    };
    eval( "window." + callbackName + "=tmpcallback" );
  }-*/;
}

To make our request we call the get method with the web service url, and an implementation of the JSONRequestHandler interface. This interface has one method called onRequestComplete(String json). This is where you’ll handle the JSON formatted data once it comes back from the server. When calling a service from within a script tag, we need to specify the name of a callback function in the request. Most services let you specify the name yourself, so the first get method generates a callback name for you. The createCallback method is a JSNI method that simply calls your JSONRequestHandler implementation when the call returns via the callback name. Note, if you use this class, to make sure and change the package name for the JSONRequestHandler call to the correct location. Finally, the get method will call the addScript function which is responsible for embedding the <script> tag on your page and setting its src attribute to the web service url.

You can read the full post here.

Since I am a fan of both JSON and GWT, I enjoy seeing good posts about using these two technologies together. I recommend this post to any Java developer who wants to make Ajax applications using web services.

The Future of JSON

I am a huge supporter of JSON as a means of communicating with the server side in an Ajax application (with a library you can use JSON for communication on all tiers of an application, but I'm going to focus mainly on the client side in this post). I was looking into the future of JSON and found two great posts by John Resig.

The first is about the need for native JSON support.

The post goes into a lot of detail about the performance benefits of JSON and has some great code samples.

Below is the summary excerpt from the post.

The current, recommended, implementation of JSON parsing and serialization is harmful and slow. Additionally, upcoming standards imply that a native JSON (de-)serializer already exists. Therefore, browsers should be seriously looking at defining a standard for native JSON support, and upon completion implement it quickly and broadly.

To get the ball rolling, I recommend that you vote up the Mozilla ticket on the implementation, to try and get some critical eyes looking at this feature, making sure that it's completely examined and thought through; and included in a browser as soon as possible.

You can read the full post here.

The second post is about the current state of JSON and goes into detail about ECMAScript proposals for the API.
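
The proposed API boils down to two functions, JSON.stringify and JSON.parse; a minimal sketch of what using it looks like (this is essentially what ECMAScript 5 ended up standardizing):

JSON.stringify({name: "John", location: "Boston"});
// => '{"name":"John","location":"Boston"}'

JSON.parse('{"name":"John","location":"Boston"}');
// => {name: "John", location: "Boston"}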

Below is an excerpt from the post about the Mozilla implementation of native JSON.

Mozilla Implements Native JSON - Mozilla was the first to implement native JSON support within it's browser. Note, however, that this is not a web-page-accessible API but an API that's usable from within the browser (and by extensions) itself. This was the first step needed to implement the API for further use.

Here is an example of it in use (works within an extension, for example):

var nativeJSON = Components.classes["@mozilla.org/dom/json;1"]
    .createInstance(Components.interfaces.nsIJSON);
nativeJSON.encode({name: "John", location: "Boston"});
// => '{"name":"John","location":"Boston"}'
nativeJSON.decode('{"name":"John","location":"Boston"}');
// => {name: "John", location: "Boston"} 

You can read the full post here.

Hopefully the future will bring native support for JSON. As usual, John does a great job on these posts, and if you don't have his blog in your feed reader I recommend adding it (he is a very good Ajax development reference).
