Hi all,

Just ran across this article

blog.taragana.com/index.php/archive/how-not-to-use-ajax

and was curious what the consensus is on the bookmarking and search indexing problems. I remember seeing something about bookmarking on the list. Thanks,

Ben

--
Ben Jackson
Development Director
INCOMUM Design & Conceito

ben-p14LI7ZcAE/pVLaUnt/cCQC/G2K4zDHf@public.gmane.org
incomumdesign.com
For one, he picked a horrible example of an Ajax site... Most Rails sites that use Ajax use it to enhance the page. Make a new todo item, and it adds itself to the list. The page still has a unique URL, the content is still there to be indexed by Google, so there are no problems. Surf the site in Lynx or with stylesheets/javascript turned off and you'll see what Google 'sees.' Google should have no problem indexing a Backpack page, even if Backpack needs a modern browser to work.

On 5/9/05, Ben Jackson <ben-p14LI7ZcAE/pVLaUnt/cCQC/G2K4zDHf@public.gmane.org> wrote:
> Just ran across this article
> blog.taragana.com/index.php/archive/how-not-to-use-ajax
> and was curious what the consensus is on the bookmarking and search
> indexing problems.

--
rick
techno-weenie.net
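A bare-bones sketch of the "enhance, don't replace" pattern Rick describes, in plain JavaScript rather than the Rails helpers. The /todos endpoint, element ids and field names are made up for illustration: the form is ordinary HTML that posts normally, so Lynx, script-less browsers and Googlebot all get a working, indexable page, and the script only intercepts the submit when it can.

// Assumed markup (hypothetical, not Backpack's):
//   <form id="new-todo" action="/todos" method="post"><input name="title"></form>
//   <ul id="todo-list"></ul>
window.onload = function () {
  var form = document.getElementById("new-todo");
  if (!form || !window.XMLHttpRequest) return; // no JS/XHR: the normal POST still works

  form.onsubmit = function () {
    var title = form.elements["title"].value;
    var xhr = new XMLHttpRequest();
    xhr.open("POST", form.action, true);
    xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        // add the new item in place instead of reloading the whole page
        var item = document.createElement("li");
        item.appendChild(document.createTextNode(title));
        document.getElementById("todo-list").appendChild(item);
        form.reset();
      }
    };
    xhr.send("title=" + encodeURIComponent(title));
    return false; // cancel the full-page submit; without JS this handler never runs
  };
};

The page keeps its unique URL and its indexable content either way; the Ajax path is purely an add-on.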
Ben Jackson wrote:
> Just ran across this article
> blog.taragana.com/index.php/archive/how-not-to-use-ajax
> and was curious what the consensus is on the bookmarking and search
> indexing problems. I remember seeing something about bookmarking on the
> list. Thanks,

Like many things, the appropriate use of JavaScript + DHTML depends on the circumstances. The example page is doing exactly what the author says is an OK use of remote scripting: dynamic page content updates.

The problem may be that the area being updated exceeds some unspecified size or content percentage.

Does it mess with Web spiders? Sure. Does it break the user's sense of the back button? Maybe. But whether or not these and other issues are sufficient reason to avoid such a technique depends on the goal of the page. What matters is: does the page provide the most appropriate user experience? The techniques used are secondary.

If I were to fault the example site, it would be for failing to use JavaScript to cache the content, which would eliminate round trips each and every time a menu item is selected.

I'm leery of arguments against doing something because it breaks expectations acquired while using rapidly aging technology. It's sort of like saying computers should not have acquired rich GUIs because it would break the paradigm of monochromatic, text-based computer interaction.

BTW, it's amusing that the author, who chides people for practices that break user expectations based on browser conventions, does so himself by not having his links underlined and blue.

James
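To make the caching point concrete, a rough sketch (the URL scheme, ids and function name are invented, not taken from the example site): the first click on a menu item fetches that section over XHR, and later clicks reuse the stored markup instead of making another round trip.

// Illustrative only: remember each section's HTML after the first fetch
// so repeat menu clicks cost no further requests.
var sectionCache = {}; // url -> HTML fetched earlier

function showSection(url, targetId) {
  var target = document.getElementById(targetId);
  if (sectionCache[url]) {            // cached: update the page with no request
    target.innerHTML = sectionCache[url];
    return false;
  }
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      sectionCache[url] = xhr.responseText;
      target.innerHTML = xhr.responseText;
    }
  };
  xhr.send(null);
  return false;                       // lets a link's onclick stay on the current page
}

A menu item could then look like <a href="/sections/about" onclick="return showSection(this.href, 'content')">About</a>, which degrades to a normal, spider-friendly link when scripting is off.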
> Like many things, the appropriate use of JavaScript + DHTML depends on
> the circumstances. The example page is doing exactly what the author
> says is an OK use of remote scripting: dynamic page content updates.
>
> The problem may be that the area being updated exceeds some unspecified
> size or content percentage.

I don't understand... the site didn't seem to exhibit any weird layout behavior.

> Does it mess with Web spiders? Sure.

There are already techniques used in Flash development to dump xhtml content to spiders, and if I'm not mistaken it's already been established that this will not get you banned for spamming.

> Does it break the user's sense of the back button? Maybe. But whether
> or not these and other issues are sufficient reason to avoid such a
> technique depends on the goal of the page. What matters is: does the
> page provide the most appropriate user experience? The techniques
> used are secondary.

Has anyone thought about implementing an application controller with a history? dojotoolkit.org/intro_to_dojo_io.html. Now you can add undo and redo commands to your app and hook them into the back button. This library also solves the bookmarking issue, by the way.

--
Ben Jackson
Development Director
INCOMUM Design & Conceito

ben-p14LI7ZcAE/pVLaUnt/cCQC/G2K4zDHf@public.gmane.org
incomumdesign.com
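This is not Dojo's actual API (its docs cover that); it's only a minimal sketch of the general technique such a history-aware controller rests on: record each Ajax state in location.hash so the back button, bookmarks and reloads can restore it. The function names and the #section scheme are made up.

// Generic hash-based history sketch, not Dojo's API. Names are invented.
var currentHash = window.location.hash;

function navigateTo(sectionId) {
  window.location.hash = sectionId;   // adds a back-button entry and makes the state bookmarkable
  currentHash = window.location.hash;
  loadSection(sectionId);
}

// Browsers of that era had no hashchange event, so libraries polled for
// changes made by back/forward navigation or by opening a bookmark.
setInterval(function () {
  if (window.location.hash !== currentHash) {
    currentHash = window.location.hash;
    loadSection(currentHash.replace("#", "") || "home");
  }
}, 100);

function loadSection(sectionId) {
  // placeholder: a real app would fetch the content for sectionId via XHR
  document.getElementById("content").innerHTML = "Showing " + sectionId;
}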
Ben Jackson wrote:
>> Like many things, the appropriate use of JavaScript + DHTML depends on
>> the circumstances. The example page is doing exactly what the author
>> says is an OK use of remote scripting: dynamic page content updates.
>>
>> The problem may be that the area being updated exceeds some unspecified
>> size or content percentage.
>
> I don't understand... the site didn't seem to exhibit any weird layout
> behavior.

My take on the writer's criticism is that the "bad" DHTML site was using dynamic content replacement as a substitute for full-page replacement.

The amount of dynamically altered content was almost the entire page; a link to a new page would perhaps better fit what the writer prefers (e.g., it allows the use of the back button, pleases spiders, and so on).

But the update was not really the entire page, just some subsection. And had that subsection been smaller (say, a single box to one side), then the writer would likely have had no issue with the behavior.

So, when the writer says that it's OK to use DHTML for dynamic page content updates, there's a hidden caveat of "but not too much." The suggestion is that, at some point, altering a page becomes replacing the page.

>> Does it mess with Web spiders? Sure.
>
> There are already techniques used in Flash development to dump xhtml
> content to spiders, and if I'm not mistaken it's already been
> established that this will not get you banned for spamming.

Of course, and these are among the things to consider when picking an implementation technique. Sometimes they matter, sometimes they don't.

James
I just did a blog entry on Ajax, bookmarking and the back button... there is an Ajax library out there that tries to resolve a lot of these issues. It's called the Dojo toolkit. Google it, or visit my blog to read a little more about it:

getintothis.com/blog/2005/05/09/the-dojo-toolkit-using-browser

And that sample site they show IS a horrible use of Ajax... all the content on the page is changing as you click on each link, so what's the point of using Ajax to begin with? Just use regular links to new pages... =P

On 5/10/05, James Britt <james.britt-Re5JQEeQqe8AvxtiuMwx3w@public.gmane.org> wrote:
> So, when the writer says that it's OK to use DHTML for dynamic page
> content updates, there's a hidden caveat of "but not too much."
>
> The suggestion is that, at some point, altering a page becomes replacing
> the page.

--
- Ramin
getintothis.com/blog
Hi,

I think many people don't get the fact that Ajax is a solution for a problem not connected to most "public" web pages. It's a solution for web applications (most of which are never used outside their company), making them more responsive and allowing UIs like those back in the "good old" client/server days.

If you use it on a public web site, you have to be careful of course, considering accessibility and usability, and of course search engines. The example in the article really shows "how not to use" it, because it tries to fix a problem that doesn't exist.

--
Thomas

On 10.05.2005 at 02:54, Ben Jackson wrote:
> Just ran across this article
> blog.taragana.com/index.php/archive/how-not-to-use-ajax
> and was curious what the consensus is on the bookmarking and search
> indexing problems. I remember seeing something about bookmarking on
> the list. Thanks,