All posts by Me

SQL Server Integration Services (SSIS) Exporting From a Stored Procedure in SQL 2005

This is just a post in my continued efforts to publicize all errors that I find in hopes that it will save somebody else a few moments.

I was using the SQL 2005 Management Studio to export the results of a Stored Procedure to a Flat File. This stored procedure made changes to other tables (Deletes, Inserts and Updates) and returned one result set.

I got the following errors when trying to export this.

- Pre-execute (Error) Messages
Error 0xc02092b4: Data Flow Task: A rowset based on the SQL command was not returned by the OLE DB provider.(SQL Server Import and Export Wizard)
Error 0xc004701a: Data Flow Task: component “Source – Query” (1) failed the pre-execute phase and returned error code 0xC02092B4.(SQL Server Import and Export Wizard)

It seems that the error may have been caused by the column headers not getting returned properly, or something related to that. I am unsure exactly what the real cause was.

The problem seems to come down to which provider you use to export the data. Both the “SQL Native Client” and the “Microsoft OLE DB Provider for SQL Server” produced the error above.

I had to use the “.Net Framework Data Provider for SqlServer” to get the data exported. This seems like a major shortcoming of SSIS. There are workarounds, but as it stands right now I was happier with DTS.
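For what it's worth, when the wizard keeps fighting you, a tiny script outside SSIS can also get a stored procedure's rows into a flat file. The sketch below is only an illustration: the connection string and procedure name (dbo.usp_Export) are made up, it assumes the pyodbc driver is available, and the SET NOCOUNT ON line is a commonly suggested way to keep the DML row counts from hiding the result set, not something this post verified.

```python
# Minimal sketch: dump one result set from a stored procedure to a CSV file.
# Assumptions: pyodbc is installed, and the server and procedure names below
# are placeholders for your own.
import csv
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes"
)
cursor = conn.cursor()

# SET NOCOUNT ON suppresses the "N rows affected" messages from the proc's
# deletes/inserts/updates, which are often what confuses rowset-based clients.
cursor.execute("SET NOCOUNT ON; EXEC dbo.usp_Export;")

with open("export.csv", "w", newline="") as flat_file:
    writer = csv.writer(flat_file)
    writer.writerow(column[0] for column in cursor.description)  # header row
    writer.writerows(cursor.fetchall())

conn.close()
```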

15 Days and no Post!

Why haven’t I posted? Most of it has to do with spending a lot of time developing Favorite Part.com. I got a huge influx of traffic but made very little money from it. I am contemplating a redesign; however, I would prefer to keep the interface clean and easy.

Another reason I haven’t been available is this 1983 Honda CB650. It was having problems charging the battery. I replaced the battery and the stator (half of the alternator) with no luck. I then replaced the voltage regulator, and that did the trick.

Yesterday I finally got around to replacing the spark plugs and cleaning the two fuel strainers (it doesn’t have a fuel filter per se). I then went to take a look at the air filter. The crappy condition it was in would have been shocking enough on its own. However, I was slack-jawed when I saw it filled with an inch or two of bird seed. The bike’s previous owner lived out in the country, and some sort of rodent had decided the air filter was an awesome dry place to store its stolen bird seed.

After cleaning out the seed and topping up the oil (some of which it burned, as evidenced by the exhaust), it is running MUCH better. It’s just a beater bike for this year and next, but it’s still very sweet to have.

Search Engine Friendly Checklist

This checklist covers 8 key points that are often overlooked when writing a web page or its content. By making sure that all 8 of these rules are in effect, you will do much better in search engine rankings. The downside is that it won’t happen overnight; search engines re-rank pages on their own schedule.

- Each page has a distinct title.
- Each page has a unique description tag.
- Each page is well formed.
- All pages can be found within 4 clicks.
- Each page has unique content from each other and from other pages on the net.
- All URLs are simple.
- Menus are minimized.
- Each page has a target topic.

Each page has a distinct title.
- This is a very important and highly overlooked aspect of web pages. The more of your pages a search engine has cached, the more traffic you will receive from that search engine. To get search engines to cache as many pages as possible from your site, each page must be different, and a major “key” is the page’s title. Take FavoritePart.com as an example: each page has a different title that mentions the main phrase “Favorite Part” and then the topic of that specific page. Some might argue that a better format would be “Favorite Part — TOPIC” instead of “TOPIC Favorite Part”. Do a search for your topic and look at the titles of the results; chances are that nearly all of the top-ranking pages have the target keyword first.

Each page has a unique description tag.
- Another way to differentiate the pages on a site from one another is the description tag. Each page should include one, and it should differ from the other pages in a style similar to the title. It should, however, be targeted at the users of search engines, since this is most likely the description a search engine will show when your page appears in results. Going back to our FavoritePart.com example, we see that the title and the description are the same. This is not ideal, but it is acceptable; better results might be had if the description were slightly more in depth.
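To keep both of the first two rules honest as a site grows, a small script can crawl a handful of pages and flag any repeated titles or descriptions. This is only a sketch: the URLs at the bottom are placeholders, and it uses nothing beyond Python's standard library.

```python
# Minimal sketch: report duplicate <title> and meta description values across
# a list of pages. The URLs below are placeholders.
from collections import defaultdict
from html.parser import HTMLParser
from urllib.request import urlopen

class HeadReader(HTMLParser):
    """Collects the <title> text and the meta description of one page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def report_duplicates(urls):
    titles, descriptions = defaultdict(list), defaultdict(list)
    for url in urls:
        reader = HeadReader()
        reader.feed(urlopen(url).read().decode("utf-8", "replace"))
        titles[reader.title.strip()].append(url)
        descriptions[reader.description.strip()].append(url)
    for label, seen in (("title", titles), ("description", descriptions)):
        for value, pages in seen.items():
            if len(pages) > 1:
                print("duplicate %s %r on: %s" % (label, value, ", ".join(pages)))

if __name__ == "__main__":
    report_duplicates(["http://www.example.com/", "http://www.example.com/about"])
```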

Each page is well formed.
- This is simple: make sure your page does not have errors on it. All HTML tags should be valid and there should be no scripting errors. While there are ways of validating your page online, a very simple test is to view it in both Internet Explorer and Firefox. If the page looks the same in each browser, it is more than likely well-formed enough for the search engines.
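A rough automated version of that eyeball test is to check that every tag that gets opened also gets closed. The sketch below is a crude approximation, not a real validator (an online validator is still the better tool); it simply reports tags left open or closed without being opened.

```python
# Minimal sketch: flag grossly unbalanced HTML tags using the standard library.
from html.parser import HTMLParser

# Tags that legitimately have no closing tag.
VOID_TAGS = {"br", "img", "meta", "link", "input", "hr", "area", "base", "col"}

class TagBalanceChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Pop back to the matching open tag; anything skipped was unclosed.
            while self.stack:
                open_tag = self.stack.pop()
                if open_tag == tag:
                    break
                self.problems.append("unclosed <%s>" % open_tag)
        else:
            self.problems.append("stray </%s>" % tag)

def check(html_text):
    checker = TagBalanceChecker()
    checker.feed(html_text)
    checker.problems.extend("unclosed <%s>" % t for t in checker.stack)
    return checker.problems

if __name__ == "__main__":
    print(check("<html><body><div>Hello</body></html>"))  # -> ['unclosed <div>']
```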

All pages can be found within 4 clicks.
- This is not a hard standard. However, for the most pages to be cached by search engines, they should be easy to find. If you have a lot of content buried under many links, those pages are far less likely to be picked up and included in search results. Keeping things within 2 or 3 clicks is better, but the visitor is always your main audience; keep them in mind first and foremost, over any search engine.
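Click depth is easy to measure with a breadth-first crawl from the home page. The sketch below is illustrative only: the start URL is a placeholder, it stays on one host, and a polite crawler would also respect robots.txt and throttle its requests.

```python
# Minimal sketch: report how many clicks each page is from the home page.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def click_depths(start_url, max_depth=4):
    host = urlparse(start_url).netloc
    depths = {start_url: 0}          # page -> clicks from the home page
    queue = deque([start_url])
    while queue:
        url = queue.popleft()
        if depths[url] >= max_depth:
            continue
        try:
            html = urlopen(url).read().decode("utf-8", "replace")
        except OSError:
            continue                  # skip pages that fail to load
        collector = LinkCollector()
        collector.feed(html)
        for href in collector.links:
            absolute = urljoin(url, href).split("#")[0]
            if urlparse(absolute).netloc == host and absolute not in depths:
                depths[absolute] = depths[url] + 1
                queue.append(absolute)
    return depths

if __name__ == "__main__":
    for page, depth in sorted(click_depths("http://www.example.com/").items()):
        print(depth, page)
```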

Each page has unique content from each other and from other pages on the net.
- If you copy and paste content from another site without adding extra information or text, the search engines will see your page as a duplicate and not include it. If it is a main page, you could lose out on any links that page has earned. Sometimes other site owners really like your content and copy your entire site, and the search engines may decide that your page is the clone. If this ever happens you can contest it with the search engines. It is not a quick or simple process, but it is better than doing nothing.
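If you want a quick sanity check that two pages are not near-duplicates, comparing them as sets of five-word shingles works well enough. The sketch below is a rough approximation of how duplicate detection can work, not how any particular search engine actually does it.

```python
# Minimal sketch: Jaccard similarity over 5-word shingles. Scores near 1.0
# mean the two texts are essentially copies of each other.
import re

def shingles(text, size=5):
    words = re.findall(r"[a-z0-9]+", text.lower())
    return {tuple(words[i:i + size]) for i in range(max(len(words) - size + 1, 1))}

def similarity(text_a, text_b):
    a, b = shingles(text_a), shingles(text_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

if __name__ == "__main__":
    original = "My favorite part of the movie was the ending scene on the beach."
    copied = "My favorite part of the movie was the final scene on the beach."
    print("similarity: %.2f" % similarity(original, copied))
```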

All URLs are simple.
- Ideally, URLs should be as simple as possible so that they can be easily copied and remembered. Check out Wikipedia for the best example: nearly all of their links are as simple as can be. For example, their Christmas link is http://en.wikipedia.org/wiki/Christmas. For an average page you may need something more complex. If you have variables in your URL, keep them small and to the point. For example, http://www.favoritepart.com?id=1 is better than “favoritepart.com?FavoriteImageIdentificationNumber=1”. An even better way would be to form the URLs like “favoritepart.com/id/1”. However, this is a bit trickier, and many argue that it does not make the search engines like your page any more than simple URLs with variables.
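How you get path-style URLs depends on your setup; on Apache it is usually a rewrite rule, and in a dynamic application it can be plain routing. Purely to illustrate the idea, the sketch below serves /id/1 style addresses from a tiny standard-library web app; the URL pattern and the response text are made up.

```python
# Minimal sketch: serve path-style URLs like /id/1 instead of ?id=1 using
# only wsgiref from the standard library. Everything here is illustrative.
import re
from wsgiref.simple_server import make_server

ID_PATH = re.compile(r"^/id/(\d+)$")

def app(environ, start_response):
    match = ID_PATH.match(environ.get("PATH_INFO", ""))
    if match:
        body = ("Favorite part #%s" % match.group(1)).encode("utf-8")
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()  # http://localhost:8000/id/1
```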

Menus are minimized.
- Do not cram all of a website’s links onto the front page. Including a link to a site map, or having a hierarchical site layout, will spread out PageRank in a controllable manner as well as aid visitor navigation.

Each page has a target topic.
- For a page to rank well with no or limited backlinks, it needs to be highly targeted and easily “taggable” by search engines. Include the topic in the URL, with the keywords separated by hyphens where possible. The title and description should also mention the target keywords or phrase. It may also help for the description to include keyword derivations, and for the page to use topic synonyms where possible.
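Getting hyphen-separated keywords into the URL is usually just a matter of generating a slug from the page's topic. A minimal sketch:

```python
# Minimal sketch: turn a page topic into a hyphen-separated URL slug so the
# target keywords appear in the address itself.
import re

def slugify(topic):
    # Lowercase, keep letters and digits, join everything else with hyphens.
    slug = re.sub(r"[^a-z0-9]+", "-", topic.lower())
    return slug.strip("-")

if __name__ == "__main__":
    print(slugify("1983 Honda CB650 Charging Problems"))
    # -> 1983-honda-cb650-charging-problems
```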

Using this quick checklist will not get you ranked overnight, but it will give your pages a much better chance of being cached and ranked well by the search engines.

Amazon EC2 Side Note

This was taken from their EC2 page. Very interesting!

Completely Controlled
You have complete control of your instances. You have root access to each one, and you can interact with them as you would any machine. Each instance predictably provides the equivalent of a system with a 1.7GHz Xeon CPU, 1.75GB of RAM, 160GB of local disk, and 250Mb/s of network bandwidth.

Pay only for what you use.
$0.10 per instance-hour consumed (or part of an hour consumed).
$0.20 per GB of data transferred outside of Amazon (i.e., Internet traffic).
$0.15 per GB-Month of Amazon S3 storage used for your images (charged by Amazon S3).
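To put those rates in perspective, here is some back-of-the-envelope arithmetic; the usage figures (one instance running all month, 50GB of outbound traffic, 5GB of stored images) are made up for illustration.

```python
# Minimal sketch: rough monthly cost at the rates quoted above, with assumed
# usage figures.
HOURS_IN_MONTH = 24 * 30                 # 720 instance-hours
instance_cost = HOURS_IN_MONTH * 0.10    # $72.00 for one instance all month
transfer_cost = 50 * 0.20                # $10.00 for 50GB out to the Internet
storage_cost = 5 * 0.15                  # $0.75 for 5GB-month on S3

print("monthly total: $%.2f" % (instance_cost + transfer_cost + storage_cost))
# -> monthly total: $82.75
```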

Amazon EC2

Just received an email from Amazon Web Services. It seems they are beta testing “Amazon EC2” (Amazon Elastic Compute Cloud). EC2 is basically a computation cloud that you can access for $0.10 per hour. “Amazon S3” (Amazon Simple Storage Service) is part of this.

It will be interesting to find out exactly what this is and if there is any way it can make what we do better or add something that we didn’t know we needed.

Here is the email that was sent to me this morning.

This is a short note to let you know about a limited beta of our newest web service: Amazon Elastic Compute Cloud (Amazon EC2). Since the beta will be very limited, we’re only e-mailing a select group of developers at this point who have been making Amazon Web Services requests in the past month.
Amazon EC2 is a web service that provides resizable compute capacity in the cloud. It is designed to make web-scale computing easier for developers.
Just as Amazon Simple Storage Service (Amazon S3) enables storage in the cloud, Amazon EC2 enables “compute” in the cloud. Amazon EC2’s simple web service interface allows you to obtain and configure capacity with minimal friction. It provides you with complete control of your computing resources and lets you run on Amazon’s proven computing environment. Amazon EC2 reduces the time required to obtain and boot new server instances to minutes, allowing you to quickly scale capacity, both up and down, as your computing requirements change. Amazon EC2 changes the economics of computing by allowing you to pay only for capacity that you actually use.
At this stage, we’re only accepting a limited number of beta customers, so if you’re interested in Amazon EC2, we recommend you move fast. If you find that our beta is full by the time you sign up, please accept our apologies and stay ready. As soon as we can accommodate more participants, we will e-mail those of you who have given us your e-mail addresses and you’ll have another chance to try Amazon EC2. Click on the link below to learn more and get started:

[Link removed as this was personalized for me]

Sincerely,
The Amazon Web Services Team

Azoogle Ads 2.0 Preview

A short while back I mentioned how Azoogle Ads updated their first login page. I also surmised that this was a prelude to a whole new interface for Azoogle. It seems that guess was correct.

For those of you who have Azoogle Ads, you should have seen a banner announcing it when you logged in. There is no set release date yet.

For those of you who do not have Azoogle Ads: Azoogle is a CPA (Cost per Action) advertising network. That means Azoogle provides you with links; you can choose any of them, and if somebody clicks one, visits the target website and completes an action (such as buying something or filling out a form), you earn money.

Azoogle Ads has been better than similar networks like Commission Junction for two reasons: Azoogle has real-time reporting, and Azoogle lets you choose any link/service that you like. CJ’s reporting is rather slow, and you need to apply to each affiliate program and be accepted, which makes testing out new campaigns rather difficult.

If Azoogle can keep their interface clean while giving better reporting options, Azoogle’s users will be armed with the best tools available.

You can test drive the new Azoogle Ads 2.0 HERE.

You can register for Azoogle Ads 2.0 HERE.

Cool UFO Videos

Here are some cool UFO videos. Part of us wants to think they are real, but the sceptic in us won’t quite allow it. Judge for yourself.

These have sound, so you may want to turn your speakers on.

Pre 9-11 UFO by the World Trade Center (Quite Honestly the Best)

Possible UFO Crash (or Frozen Plane Waste)

UFO in Bulgaria

Live Crop Circle Formation

Alien Seen in Car Commercial

AdWords Prefers Older Sites

I am constantly playing around with online marketing and often use Google’s AdWords program. I have come to realize that Google does not like new sites. Clicks cost less for sites that Google has already tagged as pertaining to certain topics; in my experience the cost may be half of what it normally would be, or less!

If you are serious about internet marketing, be sure to give your sites time to age before starting an AdWords-style click campaign.