SQL Server Integration Services (SSIS) Exporting From a Stored Procedure in SQL 2005

This is just a post in my continued effort to publicize the errors I find, in the hope that it will save somebody else a few moments.

I was using the SQL 2005 Management Studio to export the results of a Stored Procedure to a Flat File. This stored procedure made changes to other tables (Deletes, Inserts and Updates) and returned one result set.

I got the following errors when trying to export this.

- Pre-execute (Error) Messages
Error 0xc02092b4: Data Flow Task: A rowset based on the SQL command was not returned by the OLE DB provider.(SQL Server Import and Export Wizard)
Error 0xc004701a: Data Flow Task: component “Source – Query” (1) failed the pre-execute phase and returned error code 0xC02092B4.(SQL Server Import and Export Wizard)

It seems that the error may have been caused by the provider failing to get the result set's column metadata: during the pre-execute phase the wizard asks the provider to describe the rowset, and a procedure that runs other statements before its final SELECT can fail that probe. I am unsure exactly what the real cause was.

The problem seems to come down to which provider you use to export the data. Both the “SQL Native Client” and the “Microsoft OLE DB Provider for SQL Server” produced the error above.

I had to use the “.Net Framework Data Provider for SqlServer” in order to get the data exported. This seems like a major shortcoming of SSIS. Although there are workarounds, as it stands right now I am happier with DTS.
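
One workaround I have since seen suggested, sketched below with made-up table and procedure names (not from my real database), is to add SET NOCOUNT ON at the top of the procedure so the Deletes, Inserts and Updates don't emit extra "rows affected" messages ahead of the real result set. I haven't confirmed this fixes the wizard error in every case, but it is cheap to try:

```sql
-- Hypothetical procedure; the names are illustrative only.
CREATE PROCEDURE dbo.ExportReport
AS
BEGIN
    -- Suppress the "N rows affected" messages that each DELETE/INSERT/UPDATE
    -- would otherwise send back; some providers choke on these extra messages
    -- when probing for the rowset's metadata.
    SET NOCOUNT ON;

    DELETE FROM dbo.Staging WHERE Processed = 1;

    INSERT INTO dbo.Staging (Id, Value)
    SELECT Id, Value FROM dbo.Source;

    UPDATE dbo.Staging SET Processed = 1;

    -- The single result set the wizard should export.
    SELECT Id, Value FROM dbo.Staging;
END
```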

15 Days and no Post!

Why haven’t I posted? Most of it has to do with spending a lot of time developing FavoritePart.com. I got a huge influx of traffic but made very little money from it. I am contemplating a redesign; however, I would prefer to keep the interface clean and easy.

Another reason I haven’t been available is this 1983 Honda CB650. It was having problems charging the battery. I replaced the battery and the stator (1/2 the alternator) with no luck. I then replaced the voltage regulator, and that did the trick.

Yesterday I finally got around to replacing the spark plugs and cleaning the two fuel strainers (it doesn’t have a fuel filter per se). I then went to take a look at the air filter. Seeing it in the crappy condition it was in would have been shocking enough. However, I was slack-jawed when I saw it filled with an inch or two of bird seed. The bike’s previous owner lived out in the country, and some sort of rodent saw the air filter as an awesome dry place to store its stolen bird seed.

After cleaning out the seed and putting in some more oil (some of which it burned, as evidenced by the exhaust), it is running MUCH better. It’s just a beater bike for this year and next, but it’s still very sweet to have.

Search Engine Friendly Checklist

This checklist covers 8 key points that are often overlooked when writing a web page or its content. Making sure that all 8 of these rules are in effect will put you in a much better position in search engine rankings. The downside is that it won’t happen overnight; search engines re-rank pages on their own schedule.

- Each page has a distinct title.
- Each page has a unique description tag.
- Each page is well formed.
- All pages can be found within 4 clicks.
- Each page's content is unique within the site and across the net.
- All URLs are simple.
- Menus are minimized.
- Each page has a target topic.

Each page has a distinct title.
- This is a very important and highly overlooked aspect of web pages. The more of your pages a search engine has cached, the more traffic you can receive from that search engine. To get the search engines to cache as many pages as possible from your site, each page must be different, and a major “key” is the page’s title. For example, take FavoritePart.com: each page has a different title that mentions the topic of that specific page followed by the main phrase “Favorite Part”. Some might argue that a better format would be “Favorite Part — TOPIC” instead of “TOPIC Favorite Part”. Do a search for your topic and look at the titles of the results; chances are that nearly all of the top-ranking pages have the target keyword first.
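
As a quick way to audit this, here is a small Python sketch (the page URLs and HTML below are made up for illustration) that pulls each page's title and reports any title shared by more than one page:

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text inside the <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def extract_title(html):
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()

def duplicate_titles(pages):
    """Return titles used by more than one page. `pages` maps URL -> HTML."""
    seen = {}
    for url, html in pages.items():
        seen.setdefault(extract_title(html), []).append(url)
    return {title: urls for title, urls in seen.items() if len(urls) > 1}
```

Feed it every page of the site and an empty result means every title is distinct.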

Each page has a unique description tag.
- Another way to differentiate the pages on a site from one another is the description tag. Each page should include one, differing from the other pages in a style similar to the title. However, it should be targeted at the users of search engines: this is most likely the description a search engine will show when your page appears in its results. Going back to our FavoritePart.com example, we see that the title and the description are the same. This is not ideal but is acceptable; better results might come from a description that is slightly more in-depth.
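
As with titles, this can be checked mechanically. A small Python sketch (the HTML is made up) that pulls the content of each page's description meta tag; a None result flags a page that is missing one entirely:

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Collects the content attribute of <meta name="description">."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr = dict(attrs)  # attribute names arrive lowercased
        if attr.get("name", "").lower() == "description":
            self.description = attr.get("content", "")

def extract_description(html):
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description
```

Collect the results for every page and compare them for duplicates the same way as with titles.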

Each page is well formed.
- This one is simple: make sure your page does not have errors on it. All HTML tags should be valid and there should be no scripting errors. While there are ways of validating your page online (such as the W3C Markup Validator), a very simple test is to view it in both Internet Explorer and Firefox. If the page looks the same in each browser, it is more than likely well-formed enough for the search engines.
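
A full validator is the right tool, but even a tiny script can catch the most common problem: tags that never get closed. A rough Python sketch (it skips void elements like br and img, which legitimately have no closing tag):

```python
from html.parser import HTMLParser

# HTML void elements never take a closing tag.
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "param", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Flags closing tags that don't match the most recently opened tag."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append("unexpected </%s>" % tag)

def check_markup(html):
    checker = TagBalanceChecker()
    checker.feed(html)
    for tag in checker.stack:  # anything left open at the end
        checker.errors.append("<%s> never closed" % tag)
    return checker.errors
```

An empty error list doesn't prove the page is valid, but a non-empty one tells you exactly where to look.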

All pages can be found within 4 clicks.
- This is not a standard. However, for the most pages to be cached by search engines, they should be easy to find. If you have a lot of content buried under many links, those pages are much less likely to be picked up and included in search results. Keeping things within 2 or 3 clicks is better; however, the visitor is always your main target audience. Keep them in mind first and foremost, over all search engines.
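
If you have (or can crawl) a map of which pages link to which, click depth is easy to measure with a breadth-first search. A Python sketch, with a made-up link graph in the usage below:

```python
from collections import deque

def click_depths(links, home):
    """Breadth-first search over a site's link graph.
    `links` maps each page to the pages it links to;
    the returned depth is the number of clicks from `home`."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

def deeper_than(links, home, max_clicks):
    """Pages buried deeper than `max_clicks`, plus pages not reachable at all."""
    depths = click_depths(links, home)
    buried = [page for page, depth in depths.items() if depth > max_clicks]
    all_pages = set(links) | {t for targets in links.values() for t in targets}
    unreachable = sorted(all_pages - set(depths))
    return buried, unreachable
```

Anything in the "unreachable" list is content the search engines cannot find by following links at all.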

Each page's content is unique within the site and across the net.
- If you copy and paste content from another site without adding extra information, the search engines will see your page as a duplicate and not include it. If this is a main page, you could lose out on any links that point to it. Sometimes other site owners really like your content and copy your entire site; this may mean that your page, not theirs, is treated as the duplicate. If this ever happens you can contest it with the search engines. It is not a quick and simple process, but it is better than doing nothing.
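
A rough way to spot near-duplicate pages yourself, before the search engines do, is to compare their text. Python's standard difflib gives a quick similarity ratio; the 0.9 threshold here is an arbitrary starting point, not a number any search engine publishes:

```python
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Similarity ratio in [0, 1]; 1.0 means identical text."""
    return SequenceMatcher(None, text_a, text_b).ratio()

def looks_duplicated(text_a, text_b, threshold=0.9):
    """True when the two texts are suspiciously close to each other."""
    return similarity(text_a, text_b) >= threshold
```

Compare every pair of pages on the site, and compare suspect pages against the source you adapted them from.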

All URLs are simple.
- Ideally, URLs should be as simple as possible so they can be easily copied and remembered. Check out Wikipedia for the best example: nearly all of their links are as simple as can be. For example, their Christmas link is http://en.wikipedia.org/wiki/Christmas. For an average page you may need something more complex; if you have variables in your URL, keep them small and to the point. For example, http://www.favoritepart.com?id=1 is better than “favoritepart.com?FavoriteImageIdentificationNumber=1”. An even better way would be to form the URLs like “favoritepart.com/id/1”. However, this is a bit trickier, and many argue that it does not make the search engines like your page any more than simple URLs with variables.
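
To illustrate the mapping, here is a small Python sketch that rewrites query-string URLs into the path style (in practice the rewrite would run server-side, for example with Apache's mod_rewrite, mapping the pretty path back onto the query string):

```python
from urllib.parse import urlsplit, parse_qsl

def path_style(url):
    """Rewrite ?key=value query URLs as /key/value path URLs.
    e.g. http://example.com/?id=1 -> http://example.com/id/1"""
    parts = urlsplit(url)
    segments = [parts.path.strip("/")] if parts.path.strip("/") else []
    for key, value in parse_qsl(parts.query):
        segments.extend([key, value])
    return "%s://%s/%s" % (parts.scheme, parts.netloc, "/".join(segments))
```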

Menus are minimized.
- Do not cram all of a website's links onto the front page. Including a link to a site map, or having a hierarchical site layout, will really spread out page rank in a controllable manner as well as aid visitor navigation.

Each page has a target topic.
- For a page to rank well with no or limited backlinks, it needs to be highly targeted and “taggable” by search engines. Include the topic in the URL, with the keywords separated by hyphens where possible. The title and description should also mention the target keywords or phrase. It may also help for the description to include keyword derivations, and for the page to use synonyms of the topic where possible.

Using this quick checklist will not get you to the top of the rankings overnight, but it will put your pages in a much better position to be crawled, cached and ranked.