Posted by
Nadeem Jafar
|
Comments:
Posted in
smo
Furthermore, even if a search engine could figure out how to interpret a Flash file or AJAX application
adequately, parsing and indexing its pertinent content, there would be no way to navigate to that particular
part of the application using a URL. Therefore, because the primary goal of a search engine is
to provide relevant results to a user, a search engine will be hesitant to rank content in those media
well. Lastly, both Flash and AJAX would invite several more innovative and harder-to-detect forms
of spam.
The Blended Approach
But before you assume that we vilify Flash and AJAX completely, there is somewhat of a solution. A site
designer should only use Flash and AJAX for the areas of the site that require it. This is called the blended
approach. He or she should design an HTML-based site, and employ Flash and AJAX technologies where
they will provide a tangible benefit to the user. He or she should attempt to keep as much of the textual
content HTML-based as possible.
Frequently, a mix of HTML and JavaScript (DHTML) can approximate much of the interactivity of
these technologies. For example, clicking a button could hide or unhide an HTML div element. Where
Flash or AJAX genuinely adds value, the blended approach places smaller Flash or AJAX elements inside
a traditional HTML layout. In other words, you should use Flash and AJAX as elements on a page, not as
the page itself.
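The DHTML technique above can be sketched as follows; the element id and copy are hypothetical, and the point is that the visible text stays in the HTML source where a spider can read it:

```html
<!-- The product copy lives in ordinary HTML, indexable even while hidden. -->
<div id="details" style="display: none;">
  Full product details rendered as plain, indexable HTML text.
</div>
<button onclick="var d = document.getElementById('details');
                 d.style.display = (d.style.display === 'none') ? '' : 'none';">
  Show or hide details
</button>
```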
Some SEM authorities also recommend providing a non-Flash or AJAX version of content using
<noembed> or <noscript>, respectively. Unfortunately, because the content of those tags is invisible (and has
been used so pervasively for spam), their efficacy is questionable. Search engines may choose to ignore
the content therein completely. These tags may, however, enhance usability for users with disabilities, so it is
not unwise to employ them for that purpose.
This solution also misses the mark for another reason: a typical Flash or AJAX site exists on a single
"page," further limiting the utility of the tag, because all of the alternative content would presumably have to
exist on that one page!
Figure 6-15 shows an image of a site that looks like a full Flash application, but was actually built in HTML
with DHTML and hidden layers. The pictured page is http://www.xactcommunication.com/
WristLinx-9/X33XIF-WristLinx-TwoWay-Wristwatch-Radio-35.html.
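As a sketch of the <noscript> fallback just described (the script name and link target are hypothetical):

```html
<!-- AJAX-driven catalog browser for JavaScript-capable visitors. -->
<script type="text/javascript" src="catalog.js"></script>
<noscript>
  <!-- Plain HTML alternative; as noted above, engines may discount it. -->
  <p>Browse the <a href="/catalog.html">product catalog</a> as regular HTML.</p>
</noscript>
```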
Frames
There have been so many problems with frames since their inception that it bewilders us that anyone
would use them at all. Search engines have a lot of trouble spidering frames-based sites. A search
engine cannot index a frames page within the context of its other associated frames. Only individual
pages can be indexed. Even when individual pages are successfully indexed, because another frame is
often used in tandem for navigation, a user may be sent to a bewildering page that contains no navigation.
There is a workaround for that issue (similar to the popup navigation solution), but it creates still
other problems. The noframes tag also attempts to address the problem, but it is an invisible on-page
factor and mercilessly abused by spammers. Any site that uses frames is at such a disadvantage that
we must simply recommend not using them at all.
Jakob Nielsen predicted these problems in 1996 and recommended against using frames at the time.
Now, more than ten years later, there is still no reason to use them, and, unlike the relatively
benign problems associated with tables, there is no easy fix. See http://www.useit.com/alertbox/
9612.html.
Using Forms
A search engine spider will never submit a form. This means that any content that is behind form navigation
will not be visible to a spider. There is simply no way to make a spider fill out a form with free-form
fields. Even if a form consists only of pull-downs, radio buttons, and checkboxes, where the domain of
requests is defined by permutations of preset values, a spider cannot know which combinations it should
submit, and spiders do not attempt such permutations in practice.
There are some reports that Google, in particular, does index content behind very simple forms.
Forms that consist of one pull-down that directs the user to a particular web page are in this
category. However, as with the example of JavaScript links being spidered, we do not recommend
depending on this behavior. As a corollary, if such a form points to content that should be excluded,
it may be wise to exclude the content with an explicit exclusion mechanism, such as robots.txt
or the robots meta tag!
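For example, to keep a form-driven script out of the index entirely, a robots.txt rule could exclude it explicitly (the path shown is hypothetical):

```
User-agent: *
Disallow: /internal-search.php
```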
There is no magic solution for this problem. However, there is a workaround. As long as your script
is configured to accept the parameters from a GET request, you can place the URLs of certain form
requests in a sitemap or elsewhere in a site.
So if a form submits its values and creates a dynamic URL like the following:
/search.php?category_id=1&color=red
that same link could be placed on a sitemap and a spider could follow it.
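When a form's fields are limited to preset values, those sitemap URLs can be enumerated programmatically. This is a minimal sketch in Python; the parameter names echo the /search.php example above, and the preset values are hypothetical:

```python
from itertools import product
from urllib.parse import urlencode

# Hypothetical preset values for a search form's pull-downs.
CATEGORY_IDS = [1, 2]
COLORS = ["red", "blue"]

def form_urls(script="/search.php"):
    """List one GET URL per permutation of preset form values,
    suitable for placement in a sitemap so a spider can follow them."""
    urls = []
    for category_id, color in product(CATEGORY_IDS, COLORS):
        query = urlencode({"category_id": category_id, "color": color})
        urls.append(script + "?" + query)
    return urls

print(form_urls())
# The first entry is /search.php?category_id=1&color=red
```

Because the script accepts the same parameters via GET, each generated URL returns exactly the page the form submission would have produced.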