BDFL - Memuneh
61506 posts

Uber Geek
+1 received by user: 12222

Administrator
Trusted
Geekzone
Lifetime subscriber

  Reply # 580667 13-Feb-2012 08:49

A bit unrelated, but software... A few years back we started doing some rework behind the scenes on Geekzone, including a lot of database changes and script rewriting. That's because I come from an era when mainframes were priced in the millions of dollars and we had to squeeze every single instruction out of them to make them worth it.

At one of the Geekzone Pizza events an employee of our then hosting provider said he was impressed with how many operations we managed to do on our server - other customers would simply add another server to the rack and be done with it, until the next time something got slow.

The root cause was never fixed. Adding more hardware isn't a solution; it's a cheap workaround compared to paying developers to get things in order, but in the long run it's not cheap at all when someone is waiting 19 hours for a calculation to finish. Time is money...





14216 posts

Uber Geek
+1 received by user: 2572

Trusted
Subscriber

  Reply # 580670 13-Feb-2012 09:03

I've seen cases where 19 hours of processing can be done in 30 seconds once it's reworked. That was a database script, but the same would apply to Excel.

A new PC might reduce the run time by 10-50% and cost maybe $1500-$2500 given the specs you want. A good software developer might take anywhere from an hour to two days to fix the script, at between $50 and $150/hour.
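To give a feel for the kind of rework I mean, here's a minimal sketch in VBA/ADO form. The DSN, table and row count are all made up; the pattern is the real story - row-at-a-time statements versus one set-based statement.

    Sub RowByRowVsSetBased()
        ' Hypothetical ODBC DSN and Orders table - illustration only.
        Dim cn As Object
        Set cn = CreateObject("ADODB.Connection")
        cn.Open "DSN=ReportDB"

        ' Slow: 100,000 statements, one round trip per row.
        Dim i As Long
        For i = 1 To 100000
            cn.Execute "UPDATE Orders SET Total = Qty * Price WHERE Id = " & i
        Next i

        ' Fast: the same work expressed as a single set-based statement.
        ' (Shown together for contrast - in practice you'd run one or the other.)
        cn.Execute "UPDATE Orders SET Total = Qty * Price WHERE Id BETWEEN 1 AND 100000"

        cn.Close
    End Sub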




AWS Certified Solutions Architect Professional, SysOps Administrator Associate, and Developer Associate
TOGAF certified enterprise architect
Professional photographer


1493 posts

Uber Geek
+1 received by user: 191

Trusted

  Reply # 580744 13-Feb-2012 10:54

freitasm: A bit unrelated, but software... A few years back we started doing some rework behind the scenes on Geekzone, including a lot of database changes and script rewriting. That's because I come from an era when mainframes were priced in the millions of dollars and we had to squeeze every single instruction out of them to make them worth it.

At one of the Geekzone Pizza events an employee of our then hosting provider said he was impressed with how many operations we managed to do on our server - other customers would simply add another server to the rack and be done with it, until the next time something got slow.

The root cause was never fixed. Adding more hardware isn't a solution; it's a cheap workaround compared to paying developers to get things in order, but in the long run it's not cheap at all when someone is waiting 19 hours for a calculation to finish. Time is money...




+1. Throwing newer hardware at it may help, but at some stage the benefits will dwindle. Remember that nowadays CPUs scale outwards (more cores) rather than upwards (clock speed), so there is a limit to how much a new CPU can do for a single-threaded calculation. You really should invest in getting it rebuilt for multi-threaded calculation sooner rather than later.


Or, as has been pointed out, get it ported to a proper number-crunching application.
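One caveat worth knowing, sketched below: Excel 2007 and later can spread worksheet formula recalculation across all cores, but VBA macro code itself always runs on a single thread, so a macro-bound crunch sees no benefit from extra cores.

    Sub CheckCalcThreads()
        ' Excel 2007+: multithreaded recalculation applies to worksheet
        ' formulas only; VBA code always runs on one thread.
        With Application.MultiThreadedCalculation
            .Enabled = True
            .ThreadMode = xlThreadModeAutomatic   ' let Excel use every core
            Debug.Print "Recalc threads: " & .ThreadCount
        End With
    End Sub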

36 posts

Geek


  Reply # 580768 13-Feb-2012 11:33

One of the problems is that many people have a "Moore's law" view of anything related to IT, and making better use of hardware is not seen as being as cost-efficient as it was back in the mainframe days. The trouble is that Moore's law can be used to cover up software inefficiencies without ever tackling the root problem.

There is a fundamental shift needed amongst the bean counters, and a number of things come into play. Any capital item has a lifespan; the cost of that item to the business is calculated across that expectation, and depreciation is spread over that lifespan to cover capital cost and maintenance.

An application built on a piece of software is an appliance, and that appliance should have a lifespan. Given the situation, I would say that over the projected life of the business a few developer hours would be money well spent, because the longer it's put off the more it will cost downstream, and the chances of an expensive and catastrophic failure grow all the while.
It's time to retire the appliance and upgrade to one more in keeping with the realities of the business today, rather than trying to squeeze a bloody great V8 into a Model A pickup and then expecting it to do the work of a Mack truck.

You will forgive an engineer's analogy; I designed and built mobile vacuum loading plant before I got into this game.

If you have a big super sucker that is slowly wearing out, you can overcome some of the wear-and-tear problems just by putting a bigger motor in it; it might even make it work better. But at the end of the day you're still trying to suck through a hose that's too small, into a pump without enough volume, into storage that is too small, because the job has grown beyond the capabilities of the tool. Applying more power is just going to end up breaking something, and then the cost becomes an unplanned major expense rather than budgeted maintenance. Better to invest in a machine suited to the job. The product hasn't changed and the inputs and outputs remain the same; the process just runs without stressing the whole system.

A database doesn't have to cost a lot: there are plenty of Postgres people out there doing cool stuff, so there is no need to put up with Oracle or MS or IBM or SAP licensing costs. Getting the data from Excel into a DB is pretty trivial, or should be - the heavy lifting will be around the number crunching. Money well spent, for mine.
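As a rough illustration of the "trivial" part, a minimal sketch assuming an ODBC DSN named pgdb and a results(id, value) table - both hypothetical:

    Sub PushRangeToPostgres()
        ' Hypothetical DSN and table - illustration only.
        Dim cn As Object, v As Variant, i As Long
        Set cn = CreateObject("ADODB.Connection")
        cn.Open "DSN=pgdb"
        v = Worksheets("Data").Range("A2:B10001").Value   ' read the block in one hit
        cn.BeginTrans                                     ' one transaction, not 10,000
        For i = 1 To UBound(v, 1)
            cn.Execute "INSERT INTO results (id, value) VALUES (" & _
                       v(i, 1) & ", " & v(i, 2) & ")"
        Next i
        cn.CommitTrans
        cn.Close
    End Sub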

Spreadsheets are not really that scalable, especially when it comes to fault finding and error checking. A well-designed SQL database will do all that in its sleep and will scale for years.

1493 posts

Uber Geek
+1 received by user: 191

Trusted

  Reply # 580801 13-Feb-2012 12:22

Slow Clap



138 posts

Master Geek
+1 received by user: 7


  Reply # 580812 13-Feb-2012 12:35

Agree about the V8 in a Model A analogy, but as I've mentioned before, we're having to make the best of a less-than-ideal situation. A more efficient and optimised number crunch would probably need a rewrite of the whole thing, and the client probably can't afford that or wait that long. And I can indeed confirm the i5K was slower, because it can't multi-thread like the i7 and Xeon.

The client has a budget of up to $5000 incl. GST, so cost isn't much of an object for the new PC. I still get the feeling, though, that trial and error is the only way to find out how much faster it would be with, say, a dual-processor hex-core setup - and if that gamble doesn't pay off, it becomes a money pit.

5149 posts

Uber Geek
+1 received by user: 1672


  Reply # 580832 13-Feb-2012 13:10

I would have thought that $5000-odd would get you a decent chunk of a developer's time to optimise the calculations, assuming you can find a person with the right expertise.

Resolving inefficiencies in the calculation may improve speed by an order of magnitude (or more); even a very quick machine is unlikely to deliver that sort of improvement for a few years yet.

1456 posts

Uber Geek
+1 received by user: 324

Trusted

  Reply # 580852 13-Feb-2012 13:44

codyc1515: Maybe look into setting up a high-performance instance on Amazon AWS EC2, on occasion, if you're that way inclined. ;)


Have you investigated this yet, or ruled it out already?

1366 posts

Uber Geek
+1 received by user: 16


  Reply # 580862 13-Feb-2012 14:04

RunningMan: I would have thought that $5000-odd would get you a decent chunk of a developer's time to optimise the calculations, assuming you can find a person with the right expertise.

Resolving inefficiencies in the calculation may improve speed by an order of magnitude (or more); even a very quick machine is unlikely to deliver that sort of improvement for a few years yet.


+1000 

8027 posts

Uber Geek
+1 received by user: 387

Trusted
Subscriber

  Reply # 581444 14-Feb-2012 17:06

$5000 is ~40 hrs of developer time @ $125/hr, or ~80 hrs @ $60-ish/hr...

Depending on what it actually does, development is probably the better option, as you seem to be hitting the limits of throwing hardware at Excel.

118 posts

Master Geek
+1 received by user: 41


  Reply # 581527 14-Feb-2012 19:42

timmmay: 19 hours is pretty insane. Rather than a new PC, how about either:
- having someone with a lot of macro/performance experience look at what it's doing - if it's badly written, that could reduce the run time by a huge amount
- re-engineering the solution, which is a much better idea

Software developers are expensive, though.

I've done a lot of that sort of optimization in Excel - in addition to testing spreadsheets, which almost always contain errors (see our bibliography of spreadsheet errors and testing).

Depending on exactly what the spreadsheet is doing, it may be possible to reduce the run time by a factor of 10 or even 100. That is, a run time of a couple of hours, or maybe even just a few minutes, rather than 19 hours. The OP stated that the spreadsheet is only 16 MB, which isn't large relative to some I've seen, so it sounds like a very repetitive processing task - exactly the sort of thing where some carefully applied optimization could dramatically improve performance.
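The first step is always to measure rather than guess. A crude harness like the sketch below (the routine name is hypothetical) is usually enough to find the hot spot:

    Sub ProfileRun()
        ' Crude but effective: Timer returns seconds since midnight.
        Dim t As Double
        t = Timer
        Call CrunchNumbers   ' hypothetical name for the suspect routine
        Debug.Print "Elapsed: " & Format(Timer - t, "0.00") & " seconds"
    End Sub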



138 posts

Master Geek
+1 received by user: 7


  Reply # 582263 16-Feb-2012 11:07

We just brought in some X79 gear for another order, so we had the opportunity to give the number-crunch a spin on it. 15.5 hours later, I'm coming to the same conclusions as the rest of you.

It's been running non-stop on an i7-3930K (6 cores, 12 threads), a Gigabyte GA-X79-UD5, 32 GB of DDR3-1600 RAM (I remembered to set it to XMP mode), and an SSD. Once again, RAM usage is minimal.

gzt

10261 posts

Uber Geek
+1 received by user: 1578


  Reply # 582273 16-Feb-2012 11:19

If the script is particularly old you might be able to get some easy extra mileage by implementing even one or two of the global items from the many Excel macro performance checklists, e.g.:

http://www.excelitems.com/2010/12/optimize-vba-code-for-faster-macros.html

Even so, the "you change it, you own it" rule will no doubt apply, and who knows what latent problems might already be lurking in there.
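For what it's worth, the usual "global" items boil down to a wrapper like this sketch: switch off redraw, events and automatic recalculation for the duration, and restore them afterwards.

    Sub RunFaster()
        ' Save the current calculation mode so it can be restored.
        Dim prevCalc As XlCalculation
        prevCalc = Application.Calculation
        Application.ScreenUpdating = False
        Application.EnableEvents = False
        Application.Calculation = xlCalculationManual

        ' ... the existing number-crunching code goes here ...

        Application.Calculation = prevCalc
        Application.EnableEvents = True
        Application.ScreenUpdating = True
    End Sub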

118 posts

Master Geek
+1 received by user: 41


  Reply # 583224 18-Feb-2012 16:01

A skeptic asked me to justify my claim earlier in this thread that a spreadsheet could be made to run 10 or even 100 times faster. OK, challenge accepted.

I've put together an example: Improving spreadsheet run time.

In this example a simple, though highly repetitive, task is performed by two methods, each consisting of just a few lines of VBA. One method takes almost 5 minutes to run, while the other takes a small fraction of a second. The result: the second method is 12,000 times faster than the first!

Of course, this is a rather extreme example. But I think it makes the point, especially as most spreadsheet developers would use something like the first method as their approach to this task.

The example spreadsheet is available for download, so try it for yourself.
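I can't paste the workbook itself here, but the shape of the two methods is a familiar one. The sketch below is a guess at the pattern rather than the exact code from the download: method 1 writes 100,000 cells one at a time, so every assignment crosses the VBA-to-worksheet boundary; method 2 fills an array in memory and writes the whole block in a single assignment.

    Sub SlowFill()
        ' Method 1: one worksheet write per cell.
        Dim i As Long
        For i = 1 To 100000
            Worksheets("Sheet1").Cells(i, 1).Value = i * 2
        Next i
    End Sub

    Sub FastFill()
        ' Method 2: build the values in an array, write once.
        Dim v(1 To 100000, 1 To 1) As Variant
        Dim i As Long
        For i = 1 To 100000
            v(i, 1) = i * 2
        Next i
        Worksheets("Sheet1").Range("A1:A100000").Value = v
    End Sub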

688 posts

Ultimate Geek
+1 received by user: 15

Trusted

  Reply # 583233 18-Feb-2012 16:22

Another way to look at it may be this: would the client benefit from a much shorter run time? If there is an advantage to increased efficiency (guessing at a factor of 10 - it may be higher or lower, who knows, though probably much quicker), then it may be worthwhile for that alone. A family friend's small company had a pretty small Excel file with a script that took about 45 minutes to run on their machine, and they had to run it often, at unplanned times. I put them onto another friend of mine (since retired) who sorted out the script so it took roughly 20 seconds. Every time they ran it they literally had a person sitting around waiting (as it was necessary for the work they were doing), wasting many $$$ in wages.

So my question is this: would anything shorter than the 12 hours be useful to them? Would, for example, 2 hours be better? Or 20 minutes? If it's a case of time being money, then the savings could pay for themselves very quickly.
