WordPress

New Design Coming Soon

Please continue to visit during the coming weeks. We will be performing a complete site redesign and would love feedback and critiques of our ideas. We look forward to your responses and thoughts on what direction would go well with our content and its ability to enrich the web design and SEO communities of both Naperville and Chicago.

6 Great WordPress Image Plugins

Here's a list of six of my favorite WordPress plugins for images. Obviously there are more functional jQuery plugins out there, but this is a good list of basics.
  • FancyBox for WordPress - Seamlessly integrates FancyBox into your blog: upload, activate, and you're done. No further configuration is needed, but you can customize it from the options page if you like.
  • Featured Content Gallery - A WordPress plugin that lets you pick a few posts from a specific category (or via post IDs) and display them in a slide show. The slide show is very cool: it displays an image in the background and the post description in a fading gray box at the bottom of the widget.
  • Thumbnail For Excerpts - Searches the post for the first image. If one exists, it looks for the thumbnail WordPress created by default when the image was uploaded from the admin area; if there is no thumbnail, it shows the image itself, scaled down. (Important: since version 1.2, there is an option to let the plugin automatically generate the thumbnail where it does not exist.)
  • SEO Friendly Images - Images can be a great source of traffic as people search for images of various subjects, and this plugin helps make sure that you have "alt" and "title" attributes on all of your images so that the search engines can properly index them.
  • Lightbox 2 - Overlays images from the current page in neat JavaScript-powered pop-ups.
  • Flickr Photo Album - Pulls in your Flickr photosets and displays them as albums on your WordPress site.
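The "SEO Friendly Images" idea, making sure every image ends up with an alt attribute, can be approximated with a small content filter. This is only a rough sketch of the technique, not the plugin's actual code; the function name and the post-title fallback are my own assumptions.

<?php
// Rough sketch (not the SEO Friendly Images plugin itself): give any <img>
// tag in post content a fallback alt attribute based on the post title.
function nd_fallback_image_alt( $content ) {
    $fallback = esc_attr( get_the_title() );

    // Only touch <img> tags that do not already declare an alt attribute.
    return preg_replace(
        '/<img(?![^>]*\balt=)([^>]*)>/i',
        '<img alt="' . $fallback . '"$1>',
        $content
    );
}
add_filter( 'the_content', 'nd_fallback_image_alt' );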

Good Site Structure

I am grateful to offer this article as a repost. While originally posted at webdesign.org three years ago, it offers the best reason to look deeper into the site structure you'd wish developed in advance of the design. Those in Naperville and Chicago should take note of these suggestions. Thinking of them in advance of a web design can save a lot of money and prevent buyer's remorse when the site reaches completion.

Ever visited a site and wondered "what am I supposed to do?" Ever got lost in someone else's tangled web? (If not, go to the Aurora, IL homepage.) Often, people will create a site loaded with information and present it in such a way that you have to work the site out before you can do anything with it. While it's necessary to organise your information into a logical structure, remember that people find it hard to comprehend anything but the most simple and obvious structures. You may say "Well, it's pretty obvious, isn't it?", but that's only because you know your site so well. Often, I've wandered aimlessly through sites, not knowing how many levels there are or how much information is "out there somewhere" on the site, and without any idea of what I've missed or whether I "took the right turn". In this example, I've created a hierarchy that organizes a site with recipes, photos and stories. Each of these options is very commonly used on the web, and with any option the items are all accessible, but what's the best way to organize those items?
One-Tier Site: All the information is on one page.
Two-Tier Site:  On the main page, there is a link to each of the six items.
Three-Tier Site:  On the main page, there is a link to a page for each category.  These pages may have thumbnails or brief descriptions on them, and have links to the items in that category.
The advantage of a one-tier site is that there are no internal links and no navigation. Everything is right there on that page. This is perfect for sites that have very little information on them, but if your site is not minuscule, this just creates a totally disorganized "wall of text". The two-tier site allows for a fair bit more information, but if you have a large site, it will confront the visitor with an unsightly "wall of links".

This site is a three-tier site. I chose that format when I created the site, as it works well. I have a main page, then you choose tutorials, articles, resources etc., and then you choose the specific tutorial, article or resource you want. The only problem is that you can't see the third tier from the first page. I sometimes can't remember whether something's a tutorial or an article, and visitors have to choose whether they want a tutorial or a resource without knowing anything about what they will find. When I get around to redesigning this site, I'll organise it with a top-heavy combined three-tier system, explained below.

Bottom-Heavy System: This system combines the second and third tiers, with a link to each category on the main page and multiple items (e.g. stories) on one page. This is perfect if you have lots of categories but not much content in each category. If I tried to set Pegaweb up like this, every tutorial would be on a single "Tutorial Page". It would be about 800k, and about ten miles high. :)

Top-Heavy System: As long as this doesn't create mass clutter on your main page, this option has all the advantages of two- and three-tier systems. The links to each item are on the main page, but are organised into tiers on that page. There will be a fair few links on the main page, but the organisation of the site will be very easy to understand. This is the system I intend to use when I redesign this site. On the main page there will be tiers, with links to each individual tutorial, article or resource. If you look back up the page, you'll find that this system is very similar to the two-tier system - in effect, it IS a two-tier system.

The moral of the story is: design your websites to be two-tiered. Choose the bottom-heavy or top-heavy system as necessary. Unless your site is gigantic, it's possible to keep your site two-tiered, especially by using the bottom-heavy system and creating or removing categories to balance the clutter between your main page and the rest of the site.

This is a repost of a 2006 article from webdesign.org. The value of this article is still very much a topic of relevance for web design in the Chicago and Naperville areas.
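For what it's worth, the "top-heavy" idea maps neatly onto a WordPress front page: one tier of category headings, with every item linked directly from the main page. The template code below is only a minimal sketch of that structure, not part of the original article; it assumes a custom front-page template.

<?php
// Minimal sketch of a "top-heavy" front page: category headings on the
// main page, with a link to every item in each category directly below.
$categories = get_categories( array( 'hide_empty' => true ) );

foreach ( $categories as $category ) {
    echo '<h2>' . esc_html( $category->name ) . '</h2>';

    $items = get_posts( array(
        'category'       => $category->term_id,
        'posts_per_page' => -1, // list every item in the category
    ) );

    echo '<ul>';
    foreach ( $items as $item ) {
        echo '<li><a href="' . esc_url( get_permalink( $item ) ) . '">'
            . esc_html( get_the_title( $item ) ) . '</a></li>';
    }
    echo '</ul>';
}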

Web Design and Color Importance

While gathering thoughts for our new design, I have been thinking a lot about which colors to use. Every site inspires a response not just with its design, but with the colors used within the framework of that design. Now that our web design service is fully available to the Naperville community, we want our site brought to its peak performance (the new unveiling will be one month from today).

Designing websites involves numerous skilled disciplines, from type to layout to color. Color is particularly prominent as it provides the first impression to the user. The correct colors can create a good user experience, while incorrect colors can have a bad impact. To create a good website, the designer needs to know what effect colors can have on people. People subconsciously react to colors and associate them with different emotions and feelings. Colors don't just stir up emotions and feelings that might influence how a site is seen; they can also be cleverly used to direct users to specific sections of your site.

Every color you can think of can be used on the web these days, which means that picking the right colors can be a mammoth task. Here is a swift summary of how some colors can provoke certain reactions.
  • Green is linked with nature, organic things, peace, relaxation and jealousy. It is a truly relaxing color, and the paler end of the green spectrum can be used to give a site a relaxed feel.
  • White stirs up feelings of purity, simplicity, emptiness and innocence. Used as the main color of a site, it creates a clean and simple feel.
  • Blue is linked with confidence, loyalty, coolness, coldness, depression, water and peace. It's the best-known color in the world and the one most commonly associated with business sites; many companies use it to create a feeling of strength and confidence (plus, blue and orange seem to be the Naperville and Chicago favored colors).
  • Black is linked to feelings of mystery and refinement. An extremely popular color on design and photo sites, it can be used effectively to contrast and liven up other colors.
  • Grey can be associated with respect, humility, decay and boredom. It's used a lot to form shiny gradients in website design and gives a professional, ordinary feel to a site.
  • Orange is full of energy, vibrancy and stimulation, a fantastic color for web design that brings youthfulness to a design. It is also strongly associated with spirituality and healing (it's the color that symbolizes Buddhism) and has a calming energy about it: bold, not as lively as yellow but not as deep as red.
  • Purple, in its darker shades, can be very deep and luscious; it is linked to royalty, spirituality, arrogance and luxury. Lighter shades can represent romance and delicacy. It's a color that's not used much on sites.

Color's role is not just to make a website look good; it can encourage feelings and emotions from the audience. In the Chicago and Naperville areas this can be especially important because of how emotionally driven local customers can be. Choosing colors that annoy the end user can have damaging effects on your website, while choosing cleverly can mean the website meets user expectations.
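If it helps to make this concrete, one low-effort way to keep a chosen palette consistent across a WordPress theme is to print it once as CSS custom properties. This is just an illustrative sketch; the hook, the palette array and the hex values are placeholders, not a recommendation for the new design.

<?php
// Illustrative sketch: expose a small brand palette as CSS custom
// properties so every template pulls its colors from one place.
// The color values below are placeholders only.
add_action( 'wp_head', function () {
    $palette = array(
        'primary'   => '#1b4f8a', // a blue: confidence, loyalty
        'secondary' => '#e07b20', // an orange: energy, youthfulness
        'neutral'   => '#f5f5f5', // a near-white: clean and simple
    );

    echo '<style>:root{';
    foreach ( $palette as $name => $hex ) {
        echo '--color-' . esc_attr( $name ) . ':' . esc_attr( $hex ) . ';';
    }
    echo '}</style>';
} );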

SEO Through Google Sitelinks

An excerpt from SEOpedia.org. Here they explain how and why it's beneficial to have sitelinks added to your site's Google listing. Please feel free to read the original article here. -enjoy-

by Cristian Mezei

I have been testing these Sitelinks for quite some time now on several websites, from new to very old, from few visitors to zounds of them.

What are Sitelinks? They are a collection of links, automatically chosen by Google's algorithm, that appear below your website's result and link to the main pages of your website. They are randomly chosen, although you can block any link from appearing. We will discuss Sitelinks further in the Google Sitelinks FAQ section below. Recently, some of my websites got Sitelinks whilst I tried different ways of reaching this milestone.

Some time ago, Vanessa Fox, from the Webmaster Central blog, wrote that the Google Help page describing these Sitelinks had been updated to reflect "information on how Google generates these links". That's crap, to say the least, because that Google Help page about Sitelinks just states that they exist, that they are automatically generated, and nothing more. Although no official explanation beyond this very basic page is offered by Google, I will try to write down a few of my own ideas about when and how to get these Sitelinks for your website.

Whilst I can't promise you guys that ALL of the procedures below are involved in the process of making Sitelinks appear for your website, I can definitely guarantee you that SOME are. The above is true mainly because I have always (over months and years) tried 4 to 6 procedures at a time, so I can't really know which one contributed most to the appearance of Sitelinks.

Procedures which may be involved in the appearance of Sitelinks:

  1. The number of links pointing to your website's index page, using the several main keywords of your website as anchor text. For example, for my blog, the two main keywords are "Cristian Mezei", my name, and "SeoPedia", the name of my blog. Sitelinks appear only for a few main keywords, not for every keyword your website ranks for.
  2. The number of searches and SERP clicks for the main keywords I described above. You have to reach a certain number of clicks for that keyword to meet the minimum requirement for the appearance of Sitelinks. This means keywords which are not searched enough will never have Sitelinks. Although some of my colleagues have mentioned that traffic has nothing (or everything) to do with Sitelinks, I firmly believe that traffic for a particular keyword or keyphrase is very important.
  3. The number of indexed pages for the keyword you are targeting is also important. Please keep in mind that I am not talking about the number of indexed pages for your website, but about the number of results shown in Google for that particular keyword.
  4. The age of the website is definitely a factor in how and when Sitelinks appear. As far as my tests go, and using a naturally and organically built website (no extensive or forced SEO), you can NOT have Sitelinks if the website is younger than 18-24 months, varying from case to case.
  5. You have to rank #1 for that particular keyword (and the ranking has to be stable) to have any Sitelinks at all. This is very important and it has been proven true in 100% of cases.

Misleading advice about Google Sitelinks

Whilst many other specialists and/or bloggers from around the industry have tried to help you figure out ways to get Sitelinks, I will try to contradict them, because some of that advice might not contribute to your effort at all, mainly because it is just too general and my experience says it could be just loose ends. Some of this advice includes:
  • Making your website W3C valid. This is not a bad thing, but I highly doubt that it will make your website more prone to get Sitelinks. A lot of people have reported building their website with erratic code from 1992, and still having Sitelinks.
  • Having links from powerful websites. I doubt that this aspect will help you in getting Sitelinks at all. Have a look at how I see inbound links having an effect, above (in the Procedures section).
  • Having a lot of links (generally). I doubt that having tens of thousands of links of any kind will move you up the ladder regarding Sitelinks. Whilst links will help, I have explained above (in the Procedures section) specifically in what way they help.
  • Some advice was really something like: "Make the website useful" or "Add meta tags". Whilst these are surely helpful for any website, they may have nothing to do with your website getting Sitelinks.
  • Having a very well designed navigation menu. There were websites with erratic navigation menus and links, and websites with very well designed ones, and they all got Sitelinks.
  • Pagerank has nothing to do with Sitelinks. There are PR7 and PR2 websites that got Sitelinks.
Although I don't want to contradict my fellow colleagues (I just did that, but well ...), the above are my personal opinions and I wanted to stress them. The reason I didn't name names is obvious. And as the title of my post says, below you'll get the FAQ section, where I have tried to answer most, if not all, of the questions that popped up in the past year from all kinds of readers and people:

Google Sitelinks: The FAQ

Q: When are Sitelinks generated? Is there some kind of PageRank-like update?
A: I want to stress that about 4 of my websites got Sitelinks in exactly the same 1-2 day period, although the websites are very different from one another. One is 2 years old, another is 3.5. One has 1,000 links, the other has 40,000. One is in the auto niche, one is my blog. They are not linked to each other. All of this makes me think that there is some kind of general update of the Sitelinks, much like the updates for PageRank, inbound links or Google Images. Since QOT got their Sitelinks on exactly the same day (6th Feb.) as many of my other websites, I am positive that there is a general Sitelinks update.

Q: I can't see any Sitelinks generated within my Sitemaps account, although they appear in Google!
A: Sitelinks take anywhere from 2 weeks to 1 month to appear within your Sitemaps account after they first appear in the SERPs. Then you will have better control over some of the links.

Q: Why doesn't my very important "Clients" page get into the Sitelinks section?
A: This may have to do with the fact that Sitelinks are usually generated from first-level links only. This means that if a page is only reachable by two clicks, it will never be included in the Sitelinks section. On rare occasions deeplinks will be chosen, but I am not sure how those websites are chosen. Also make sure that you have pure HTML links; no JavaScript or Flash.

Q: My website doesn't have many text links. Does this mean I'm doomed?
A: Google will generate Sitelinks from image links too, as long as the image has an ALT attribute. As other people have found, it seems that the Sitelinks algorithm may choose a Sitelink even if you have no link towards that page from your own website, provided the page has a large number of links from other websites.

Q: What's the point of having these stupid Sitelinks?
A: One simple and huge reason: trust and brand. Sitelinks have begun to signal trust lately in the eyes of the normal surfer (not to us SEMs, simply because we know there are heavily penalized websites that still got Sitelinks), so any website that has them is more likely to get clicks from the SERPs for the search terms that show Sitelinks.

Q: What's the minimum and maximum number of Sitelinks I'll get?
A: Minimum 2, maximum 8. Nevertheless, I still can't figure out how Google assigns the number of Sitelinks to each website, except by popularity. Most of my popular websites have 8. Most of my not-so-popular websites have 2 to 4.

Q: I don't have a Google Sitemaps account. Will I still get Sitelinks?
A: Definitely. The only drawback is that you will not have any control over them.

Q: How are the Sitelinks calculated? Which links get in and which don't?
A: There are all kinds of opinions. After closely studying all my websites, I still believe that they are chosen randomly: not by traffic, not by inbound links. There's an interesting thread at SEW which you might want to read for some speculation.

Q: I have a page in the Sitelinks section that doesn't exist anymore. What should I do?
A: It appears that the crawl delay for Sitelinks is at least one month. So if you have a page that doesn't exist anymore, 301 redirect it to the new one. The Sitelinks will then work fine.

Q: Can I remove Sitelinks from my Sitemaps account if I don't like them?
A: Indeed you can. But please be careful when you do, because if you remove a Sitelink it will not be replaced by another. This means that if you had 6 Sitelinks and you block one because it's not appropriate, you will be left with 5 Sitelinks in the Google SERPs; the 6th one will not be replaced with a new Sitelink.

Vanessa Fox Nude: the forgotten all-important post

The title is just a teaser for Vanessa. She's had that Nude thing like forever :) For you guys who don't know Vanessa, she's the woman who led the Google Webmaster Central team until she moved to Zillow. In this section I'll analyze the post she made on her blog right after she left Google. I'm actually amazed that I can't find any reactions to this post, since IMHO it's the most important post about Sitelinks ever: more important than what Google has released, and certainly more important than what I or my colleagues speculate, simply because she's been involved in the process of releasing the Sitelinks. Block quotes are quotes from Vanessa's post:
For instance, if I do a Google search for [duke's chowder house seattle], am I looking for directions? Hours? A menu? Google doesn't know, so they offer up several suggestions. (Quality aside: a link to the menu shows up in the sitelinks, but if you do a search for [duke's chowder house seattle menu], that same link doesn't show up on the first page. In fact, no pages from the Duke's site show up.)
Basically, what Vanessa is telling us is that Sitelinks will NEVER appear for specific search terms. So that’s why we get Sitelinks for “Computers” or “Cristian Mezei” or “HP” or generally, company names as well as very general industry terms.
Google autogenerates the list of sitelinks at least in part from internal links from the home page. You’ll notice in the Duke’s example that one of the sitelinks is “five great locations” which also appears as primary navigation on the Duke’s home page. If you want to influence the sitelinks that appear for your site, make sure that your home page includes the links you want and that those links are easy to crawl (in HTML rather than Flash or Javascript, for instance) and have short anchor text that’ll fit in a sitelinks listing. They’ll also have to be relevant links. You can’t just put your Buy Cheap Viagra now link on the home page of your elementary school site and hope for the best.
In the above, Vanessa confirms what I already told you in the FAQ section: Sitelinks will be chosen from links present on the homepage only. I still firmly believe that some websites have Sitelinks from deeplinks within the website; how and when those websites are chosen is still a mystery. One more important thing we learn is that Sitelinks are chosen from relevant links on the homepage. Instead of repeating what Vanessa said about relevance, read the quote above. There is a lot of other useful information in Vanessa's post, but since I already tackled those points in my previous sections, I left them aside.
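As a practical aside (my sketch, not part of Vanessa's post or Cristian's), keeping the home-page navigation as plain, crawlable HTML with short anchor text is trivial in WordPress with a registered menu. The menu location name below is just an example.

<?php
// Sketch: a plain HTML navigation menu, so home-page links are crawlable
// (no Flash or JavaScript) and each link carries short anchor text that
// would fit in a Sitelinks listing. The location name is an example only.
add_action( 'after_setup_theme', function () {
    register_nav_menu( 'primary', 'Primary Navigation' );
} );

// Then, in the theme's header template:
wp_nav_menu( array(
    'theme_location' => 'primary',
    'container'      => 'nav',
) );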

Other opinions about Sitelinks

I asked a colleague of mine who is also involved in SEM what he thinks about Sitelinks. I thought I'd put his answer here as well:
Marius Mailat www.submitsuite.com
Cristian asked me for my opinion regarding Sitelinks. Breaking the question into small parts, here are my thoughts. The sitelinks option in the Google results is similar to the siteinfo.xml provided for the Alexa toolbar: a simple option for a webmaster to provide the most important direct links into his website structure. Google's version of Siteinfo is different because you cannot specify WHICH link on your website becomes a Sitelink; you can only ask to remove a link from the Sitelinks (a Google Webmaster panel option). Why do Sitelinks appear, when, and under which algorithm? The algorithm used is totally automated and takes the following criteria into consideration:
  • An old, powerful website.
  • The sitelinks are pages which rank in first position in the SERPs.
  • The sitelinks are most of the time associated with words related to top results: "domain", "domain download", "domain demo", etc.
  • The sitelinks are probably not influenced by PageRank.

Other very useful locations on the web for Sitelinks

Have fun with Sitelinks. If you have any questions, suggestions or rectifications, write a comment.

Cleaning Up the Dirty Domain

Cleaning Things Up!

So now that you know this "method" of SEO is archaic, ineffective and sloppy, how do you go about fixing your site? Whether or not you should "fix something that isn't broken" is something only you (if you have the knowledge), your in-house SEO or an outside SEO firm can really answer, as it really does need to be looked at on a case-by-case basis. There IS potential risk involved with changing URL structure, and it should be assessed. That said, I've had a lot of success with site structure migrations and most times will choose to slowly migrate the site to a new, sensible URL structure.

Some tips for site structure migration: if you do choose to change your URL structure, you'll find some tips below, based on my experience with previous migrations:
  • Prepare for the fact that it takes Google a bit to “figure things out”. The more often you get crawled, the less time it will likely take for Google to get with the program. I’ve seen URL migrations take anywhere from a few days to a few weeks to get sorted out.
  • Always choose a small, “mid-level” traffic section to start with. This way, you can see the results before enacting things on a larger scale and/or with your most important keywords.
  • Make sure you have a good 301 plan in place for pointing the old URLs at the new ones. Without it, your new URLs will not take the place of your old URLs or inherit their authority, link popularity and rankings. (A minimal redirect sketch appears after these tips.)
  • Change the site navigation, but use your internal sitemap as a "reminder". What I mean by this is that I usually change the links to the new URLs throughout the site navigation and within the site content, but I leave a link pointing to the old URL on the internal sitemap (not the one you feed to Google via WMC) until I'm sure Google has seen the redirect and removed the old link from the index.
  • Wait a few days, watch for the new URLs to be indexed, the old URLs to be removed and wait to ensure the new URLs take over the rankings formerly held by the old URLs.
  • If everything goes smoothly then I wash, rinse and repeat with other sections. Take things slow. It takes a while, but if anything goes wrong, you want it to go wrong with one small piece and not your entire site.
In addition to the above tips, it never hurts to go out and get some quality links to the new URLs from tightly themed sites, to let the engines know that your new URLs are just as relevant as your old ones and that they're still important to crawl regularly (and hopefully to speed up the migration process).
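For the 301 plan itself, the redirects can live inside WordPress while the migration settles. The snippet below is a minimal sketch of that idea, assuming a hand-maintained map of old paths to new ones; the paths and the function name are hypothetical.

<?php
// Minimal 301 sketch for a URL-structure migration: map each old path to
// its new home and redirect before WordPress serves a 404.
// The paths below are hypothetical examples.
function nd_migration_redirects() {
    $map = array(
        '/your-folder-now-a-page.html'    => '/your-folder/',
        '/your-folder-now-this-page.html' => '/your-folder/this-page/',
    );

    $request = parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH );

    if ( isset( $map[ $request ] ) ) {
        wp_redirect( home_url( $map[ $request ] ), 301 ); // permanent redirect
        exit;
    }
}
add_action( 'template_redirect', 'nd_migration_redirects' );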

Dirty Domains Damage SEO

Changed Your WordPress Permalinking Yet? You May Wish To.

Toolbar PageRank used to be all the rage, and most optimizers were all about creating those green pixels and maintaining them through any means necessary. One of the things optimizers noticed early on was that PageRank seemed to be site based and distributed from "the top down". What this means is that if your homepage was a low to mid PR5, any pages directly off the root of the homepage would be a PR4. Subfolders directly off the homepage would also be a PR4, but pages within the subfolders would be a PR3.
  • yourdomain.com = PR5
  • yourdomain.com/your-page.html = PR4
  • yourdomain.com/your-folder/ = PR4
  • yourdomain.com/your-folder/this-page.html = PR3
  • yourdomain.com/your-folder/another-folder/ = PR3
  • yourdomain.com/your-folder/another-folder/another-page.html = PR2
Back then, the higher your toolbar PR was, the higher the likelihood that your pages would rank (that wasn't all there was to it - lower PR could beat higher PR, but toolbar PR was a big portion of things in those days). So a lot of optimizers took to creating "root based" sites, which essentially means that every page of the site was built off of the root.
  • yourdomain.com = PR5
  • yourdomain.com/your-page.html = PR4
  • yourdomain.com/your-folder-now-a-page.html = PR4
  • yourdomain.com/your-folder-now-this-page.html = PR4
  • yourdomain.com/your-folder-another-folder.html = PR4
  • yourdomain.com/your-folder-another-folder-now-another-page.html = PR4
Back in the day it made sense. Nowadays, it's pointless and messy. First things first: toolbar PR is for entertainment. Most optimizers have known that for a long time. Secondly, PR either didn't remain or never was "site based" (you can decide that one) and is instead based on individual page factors (like inbound links). Thirdly, the Google algorithm is no longer a one-ingredient wonder. Nowadays we have trust, age, authority and various other factors contributing to how a site ranks, in addition to PageRank.

In addition to the strategy now being pointless, it's also messy. Not only do you risk confusing crawlers with the lack of a logical site structure, but it is also beyond annoying for someone who has to work with a root-based site of several hundred or thousand pages to trudge through it. It's like opening a huge walk-in closet to find a tie, only to find you can't even walk into it because everything is simply thrown on the floor with no rhyme, order or reason. But it seems a lot of "SEO firms" didn't get the memo, and we are constantly seeing sites with this "SEO method" employed. A response to this posting, with corrective changes, will be posted momentarily.
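In WordPress terms, moving off a flat, root-based structure is mostly a permalink setting. The snippet below is only a sketch of the programmatic equivalent; in practice you would normally just pick a custom structure under Settings > Permalinks and then handle the old URLs with 301 redirects, as described in the follow-up post above.

<?php
// Sketch: switch to a folder-style permalink structure instead of a flat,
// root-based one. Equivalent to choosing a custom structure under
// Settings > Permalinks.
update_option( 'permalink_structure', '/%category%/%postname%/' );

// Rebuild the rewrite rules after changing the structure.
flush_rewrite_rules();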

Congratulations to Naper Design

As of this morning, Naper Design is the #1 ranked site for the search term "Naperville SEO". In only 1 month and 12 days, Naper Design has shown the ability to use non-purchased SEO and natural feeds to build site traffic. Now our goal is shifting: to also earn a top-25 listing for Chicago. If anyone thinks that sounds like setting a low goal, please feel free to Google "Chicago SEO". The goal is not only reachable, but we hope to be listed within this ranking by early October. We will also be focusing on increasing our rankings for the search terms "Chicago Web Design", "Naperville Search Engine Optimization" and "Naperville Web Design".

Please feel free to verify the results; we were listed #1 this morning. As you can see, there is no login to Google, so nothing would be affecting the indexing based on preferences. This is further proof that high-cost local SEO groups are charging more for their services than the average business should be paying. This is in no small part a benefit of the mapping techniques discussed here.

SEO! Could the experts be wrong?!!!

While stomping through the SEO threads today, I came across an interesting question concerning keyword usage and search ranking. It asks a simple question: what if the "experts" are wrong? If you've kept up with most of my posts, then you know what my thoughts on the self-proclaimed experts are. That being said, there are some points made by this article that are helpful and not as generic as many of the postings online. All credit to webdesign.org for this article. -enjoy-

Search engine optimization, or SEO, has become a huge online "industry". It's an accepted fact that if your website ranks higher in the search results for a given keyword, then there is a better chance that it will get more visitors. The best part is that these are considered "organic" visitors, meaning that you don't have to pay for advertising to get them to your site. In the online business world, this translates into huge earnings with very little advertising cost. In other words, the ultimate goal of any online business owner.

So being able to figure out exactly how to get that 100% guarantee that your site will end up on top of the search results is basically the Holy Grail of SEO. There are people out there who dedicate their careers, even their lives, to figuring this out, because they know that if they did, they would become overnight millionaires. It would be like cracking a code that nobody else in the world could. Some people may have parts of it figured out, but no one has been able to give that 100% guarantee. The problem is a lack of consistency.

If you asked anyone who has a website how they were going to get to the top of the Google search results, 99.9% would agree they would need the highest PageRank possible. There is no doubt that a high PageRank is important, but is it a guarantee that your site is going to make it onto the first page of the results? A lot of people would say yes, and when Google announces that they are going to update PageRanks, most website owners are on pins and needles wondering if they are going to move up or down. But what if it didn't really matter?

There is an internet marketer (probably one of the last truly honest ones out there) by the name of Jonathan Leger who has spent a great deal of time testing the theories that the SEO experts or "gurus" have been selling to website owners for years. One of these is the "PageRank theory". He ran a very intensive case study and came up with some interesting results. He found that up to 1/3 of the time, websites with lower PageRank actually ranked higher in the search results than those with higher PageRank. Sometimes a great deal higher. Consider a website with a PageRank of 4 ending up 5 steps above a website with a PageRank of 7 or 8. It actually happens!

But this isn't the only theory the SEO experts have been preaching that doesn't stand up under Mr. Leger's study. He has found 7 (yes, 7!) different SEO theories that don't always hold true in the real world. That's why, if you are at all interested in getting your website to the top of the search results, you have to read Jon's report. And of course, he surprises us all again by the fact that he isn't charging a single dime for it. All you have to do is go to his website and download the free report. This whole thing could end up making some people very angry, but I have a feeling that the rest of us will be very happy!

Original article from webdesign.org

Visualizing Your Objective – Part 2

Mapping Your Objective

If you have done a map search for random companies lately, you'll start noticing a trend. Companies that aren't franchised and have only one physical address are popping up all over the place. If you do a search for Illinois Web Design, a few names (not mentioning them) will pop up across the entire map of the state, when they are located at only one address. This is a trick that is being used more and more for SEO and SEM. I won't take a stance to endorse or denounce this tactic, because it does provide results.

While some may say this method is dirty, tainted, or just underhanded marketing, it's being used by businesses everywhere to propel their site recognition and visibility. The point is not to consider any form of media as too small to be useful. Whatever addresses are applicable to your business or site should always be listed in the Google, Yahoo, and MapQuest map sites. Unless there are privacy concerns, this is an easy method of getting multiple links and search traffic to your site. If there are privacy concerns, it is quite apparent that a lot of businesses are using addresses that are not their own, not remotely related to the company, and in a few cases, don't even exist. Like I said, I don't take a stance either way on the address issue. Whatever address you choose to use is your own business and will be pertinent to your site needs. The point is to consider adding site traffic through any means possible and to elevate your position in the search engines. If you find that having multiple addresses within search maps allows for extra exposure, it might be worth considering.

Writing for SEO

I found an article at helpmeblogger.com that is great in relation to writing content for SEO without sounding redundant or confusing. The link to the original article is at the base of this post. -enjoy-

You'll see lots of advice about "SEO", Search Engine Optimization, or "using keywords." Much of that advice can actually hurt your Google rankings, especially "keyword stuffing," or deliberately over-using the terms you think will bring your page to the top ranks in a search. Google and other search engines are constantly changing the way they calculate search rankings, and they're getting smarter about figuring out ways of rewarding quality sites. That means that the best way of making sure your pages and posts have top search engine rankings, and appear in the first few results when someone does a search, is to write well. Really. Good writing trumps all the deliberate SEO keyword techniques.

But what, you ask, is good writing? I'm glad you asked, Grasshopper. To begin with, good writing means being as clear as you can about what exactly you're writing about. Use a very clear and specific title for your page or post. Remember that a lot of readers will only see your title in their RSS feed, and the title has to be clear enough that they click to read the post. You want to be accurate, descriptive, and brief.

Don't use the same subject phrase over and over; that's called keyword stuffing, and it's not only repetitive and boring, it's downright obnoxious. Overusing a keyword or phrase can suggest to a search engine that you're a spam site, and your ranking will suffer accordingly. Do use the most appropriate descriptive and specific language. Being specific means that your reader doesn't have to guess what you mean or what you're talking about. Specificity makes your work easy to find. Remember that just because you call cats kitties doesn't mean that your readers will; so think about synonyms and likely search phrases. What will help a reader find your post? You might choose to use cats, kitties and felines, and you might also deliberately choose to avoid using pussies, since you don't want to attract porn spam.

Google offers some tools to help you write good posts. One of them is the Google keyword tool, which helps you figure out what your readers might search for, in terms of using the best terms for your post or page. There are also some helpful PDF guides from Google. The Search Engine Optimization Starter Guide is exactly what it sounds like: a basic how-to for writing that makes your content easy to find.

Original article by Lisala at helpmeblogger.com

Visualizing Your Traffic Objective – Part 1

Reverse-Ripple Marketing Theory

So here we are, trying to improve our site traffic. Many people begin this aimlessly and without any clear idea of what they wish to accomplish. I can speak from experience that quite often I have done the same. The misconception is that if you build a site, then send a bunch of invites and plug it to generic media, people will automatically blow up your Alexa ranking, right? Not necessarily. A lot will depend on how targeted your approach may or may not be.

Visualize the Approach

Think of throwing rocks into a small pond or lake. The ripples will swell outward in various patterns and expanses. They become even smaller as they travel outward. As they encounter obstacles and other ripples, their direction shifts in many more obscure ways. This is the key visualization for increasing site traffic... in reverse, of course.

Our goal is to increase site traffic. This means that we have to create these ripples in their reverse form. Casting a large net across various topics related to your site will inevitably cause these ripples to develop and move back to their origin in waves of response. The net has to be cast wide and in substantial volume. Anyone with a blog presence can blow up their numbers for a few days. We see companies offering this for exorbitant amounts on a regular basis. They work with mob-like tactics; as soon as you stop paying, your site traffic immediately drops to the floor. This is not a reason to run back to them (unless you really enjoy paying money for a service that's rigged for only short-term gains).

Diversify to Enhance Your Marketing View

To have a traffic plan with true longevity, you have to think of all of the ripples in the pond. What happens when a ripple encounters an obstacle? It changes direction; it becomes obscure. Whatever you may think, all ripples of net traffic eventually become obscure. The point is to use the obscure traffic to grow your site. All of the direct and sharp traffic is like the ripples closest to the rock: they are large and even, but they only reach a certain radius. The same amplitude is expressed by the smaller, uneven ripples that have expanded outward. What separates them is that they are cast over a much larger area and range. This wider net makes for slow, sometimes bent and obscure group reaction.

Keep in mind, all waves have roughly the same amplitude. With this thought, imagine pushing traffic in small, obscure waves that build in size while traveling back to your website. This requires a lot of work and interaction with other sites as well: using the blogosphere, press releases, Twitter, Facebook, Myspace, Ning, Digg, Del.icio.us, LinkedIn, etc. The list could go on for some time. Now these are at the top of nearly every "at-home SEO expert's" list, but what of the lesser-known sites? What about blogs that are only minimally related to your goal? They ARE all fair game, regardless of what some experts may say. The point is to not just think of social networking, not just think of bookmarks, not just think of the quick-response websites, but to think of every possible eventuality in response to the content you have to offer.

My Enemy Is My Friend

But what you have to offer may be something to consider as well. When ripples from different points of action intersect, they modify each other and cause warps within their framework. This is also a benefit if we play the tape in reverse.
By interacting with your competitors, you will wind up giving and receiving more traffic. By competing in a communicative manner, both sites win more traffic and reach a larger audience. This is only the first in a five-part series of posts. Using this "Reverse-Ripple Marketing Theory" will be a long, productive process, but the rewards for the work put into it yield far more than paying for quick-jolt traffic that is bound to fall off as quickly as it arrived.