Back to Business!

September 28, 2012

Whew! These last two weeks have been rather refreshing. Instead of having to think up new business rules and routines, I’ve been left to just do programming. Unsurprisingly, I’m a good bit better at that than I am at business management.

The first task to be tackled came about because one of our distributors started sending out a new data feed for its inventory information; the problem was that the feed was in a different format. Originally, their feed had a comma-delimited string showing warehouse availability (like Y,Y,Y,Y to show that an item is available in all four warehouses). The new feed, however, has no column headers, and the warehouse indicators are split into individual columns. I handled this by importing the feed into SQL Server, concatenating the indicator columns together with commas, and depositing the reformatted data into the usual holding table for the feed. However, we also needed to get this information into Microsoft Access, and Access was expecting column headers different from those I use in SQL Server. To adjust for that, I exported the holding table to a text file, then used PowerShell to do a string replace on the file contents to get the column headers right. The text file was then linked into Access, where it’s imported into the Access version of the table, replacing the old routine we had for this feed.
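The reformatting step above can be sketched in a few lines. This is a minimal illustration in Python rather than the SQL Server/PowerShell pipeline the post describes, and every column and header name here is assumed, not taken from the real feed:

```python
import csv
import io

# Hypothetical headerless feed: item number, price, then four
# warehouse-availability indicator columns (layout assumed).
new_feed = "SKU123,19.99,Y,Y,N,Y\nSKU456,5.25,N,Y,Y,Y\n"

# Headers the Access-side import expects -- names assumed for illustration.
headers = ["ItemNumber", "Price", "WarehouseAvailability"]

out = io.StringIO()
writer = csv.writer(out)
writer.writerow(headers)
for row in csv.reader(io.StringIO(new_feed)):
    item, price, *indicators = row
    # Collapse the individual indicator columns back into the old
    # comma-delimited availability string (e.g. "Y,Y,N,Y").
    writer.writerow([item, price, ",".join(indicators)])

print(out.getvalue())
```

The same idea applies whether the concatenation happens in T-SQL or in a script: rebuild the old single-column availability string, then write the file out with the headers the downstream import expects.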

One regret I had in handling this problem was that I couldn’t use PowerShell to download the file from the distributor, which would’ve given us an easy all-in-one solution; when I attempted the download through PowerShell, even using an authenticated session, I’d get only 6KB of the 2MB file before the download terminated. I’m not sure why, but I’m guessing the distributor doesn’t want command-line connections to its datastore. The file’s not on an FTP server, either, so my usual routines for getting FTP data don’t apply. Still, manually downloading the file only adds a minute to the procedure, so it’s not too bad.

Next up was finding the average shipping cost over the past 30 days for each item we’d sold in that time span. The trick here was that orders with multiple items couldn’t be used in the calculation, since there’s no way to split their shipping costs apart in our database; the costs are summed together in the shipment information table. So, using a series of CTEs and aggregates, I pruned the orders table down to single-item orders, joined those orders’ items to the shipment info table, then summed up the shipment costs and averaged them out. It was a little tricky to figure out, since the shipment info table doesn’t contain item numbers, but doing the calculations separately and joining the results worked very nicely. It’s a great technique I’ve learned from a ton of reading at SQLServerCentral.com :).
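The prune-then-join technique above can be demonstrated end to end with a toy dataset. This sketch uses SQLite (via Python’s built-in `sqlite3`) rather than SQL Server, all table and column names are assumed, and the 30-day date filter is omitted for brevity:

```python
import sqlite3

# Toy schema (names assumed; the real tables live in SQL Server).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE order_items (order_id INTEGER, item_no TEXT);
CREATE TABLE shipments   (order_id INTEGER, ship_cost REAL);
INSERT INTO order_items VALUES (1,'A'),(2,'A'),(2,'B'),(3,'A'),(4,'B');
INSERT INTO shipments   VALUES (1,5.0),(2,12.0),(3,7.0),(4,4.0);
""")

# CTE 1 prunes to single-item orders (order 2 has two items, so it drops
# out); CTE 2 attaches each surviving order's item number; the outer query
# joins to the shipment costs and averages them per item.
rows = con.execute("""
WITH single_item_orders AS (
    SELECT order_id
    FROM order_items
    GROUP BY order_id
    HAVING COUNT(*) = 1
),
order_with_item AS (
    SELECT oi.order_id, oi.item_no
    FROM order_items oi
    JOIN single_item_orders s ON s.order_id = oi.order_id
)
SELECT o.item_no, AVG(sh.ship_cost) AS avg_ship_cost
FROM order_with_item o
JOIN shipments sh ON sh.order_id = o.order_id
GROUP BY o.item_no
ORDER BY o.item_no
""").fetchall()

print(rows)  # item A averages (5.0 + 7.0) / 2 = 6.0; item B: 4.0
```

Note how the shipments table never needs an item number: the single-item restriction makes the order ID a reliable bridge between item and shipping cost, which is exactly why multi-item orders have to be excluded first.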

The last task for this week, which will need to be done over the weekend, is some capacity planning for a new Access database file. Our master product tables are starting to bloat, causing our data file to hit the dreaded (and tiny) 2GB limit for Access files rather often, so I’m planning to move some of those tables into their own data file and re-point the links in our front-end to the new file. It should be a rather easy and painless procedure, and from what I’ve calculated, it should buy us a huge amount of extra breathing room. It would definitely be preferable if Access abolished the 2GB limit on its files, but I’m guessing that would require rather drastic changes to the program’s underlying code.

In any case, it feels quite good to get back to pure programming. I’d really rather do nothing else if I could, but the business is shorthanded these days, so everyone’s having to do everything. In fact, just last week, one of the accountants was being trained to do shipment preparation work, and one of the inventory crew was being trained to do accounting! I’m really not quite sure that’s the best idea, though.

From → Programming
