Subject: Re: [nmr_sparky] Re: hello....slow spectra refreshing
From: P pirate
Date: Oct 19, 2006



Hello Tom,
Thanks a lot for your comments.
I posted before that I did convert to UCSF format, and the result was a little better but still very slow. I also reported that I increased the cache to 64 MB and then to 128 MB, and it was almost the same, even the second time I loaded the spectrum.

When I started posting about this problem, it was just to make sure I wasn't forgetting anything in the settings, since I'm new to Sparky. Maybe it was my mistake to compare Sparky with NMRView and with Auremol (which is unbelievably fast but very expensive).

I kept pushing on the problem because, as you know better than I do, we often need around 20 contour lines to efficiently tell one peak from another. It's already a big puzzle to get all of them assigned, and if during the process there is all this slow refreshing going on... arghhhh... :(

Thanks again, and let me know if there are plans to make Sparky faster. It looks very promising indeed.

Just to finish... did you look at my post about the non-Python version? It opens, but then it gives the error: Sparky/Lib/Sparky ... no such file or directory.


Best regards,
PP


Thomas Goddard goddard@... wrote:
Hi,

Here is some info about Sparky contour drawing speed.

Sparky computes contour lines and then displays them. The computation takes
longer than the display. Sparky remembers the computed contours for the
specific contour levels you use, so the second time it draws them can be
significantly faster. It will only remember them if the NMR data cache
size set in the Sparky File / Preferences dialog is large enough. Note
that the default cache value is 4 Mbytes, established in the days of much
smaller computer memories. I'd suggest increasing it to at least 64 Mbytes.
The setting is only saved in Sparky spectrum session files (save files),
so it won't automatically be used whenever you run Sparky unless you open
a file that has that setting. (Sparky has no global preferences file.)
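
To make the caching behavior concrete, here is a minimal sketch in plain
Python (not Sparky's actual code; the class, key, and sizes are hypothetical)
of contours memoized per set of levels, which only helps if the cache is big
enough to hold them:

    # Minimal sketch (hypothetical, not Sparky's code) of contours
    # memoized per set of levels, kept only while they fit in the cache.
    class ContourCache:
        def __init__(self, max_bytes=64 * 1024 * 1024):  # e.g. the 64 Mb suggested above
            self.max_bytes = max_bytes
            self.used = 0
            self.entries = {}  # (spectrum, levels) -> (contours, size)

        def get_or_compute(self, key, compute, size_of):
            if key in self.entries:                 # fast path: second draw
                return self.entries[key][0]
            contours = compute()                    # slow path: first draw
            size = size_of(contours)
            if self.used + size <= self.max_bytes:  # a too-small cache means
                self.entries[key] = (contours, size)  # nothing ever sticks
                self.used += size
            return contours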

The earlier suggestion in this thread to try converting the Bruker format
spectrum to UCSF format was worth a try. The size of the spectrum blocks
within the file can affect performance, especially if the Sparky cache
size is too small to keep all the data you are viewing in memory.
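
As a back-of-the-envelope illustration (not Sparky's I/O code; the view and
block sizes are made-up numbers) of why block size matters when the cache
cannot hold the view, count how many file blocks a fixed view intersects,
since each block miss is a separate read:

    import math

    def blocks_touched(view_w, view_h, block_w, block_h):
        # Number of file blocks a rectangular view intersects,
        # assuming the view starts on a block boundary.
        return math.ceil(view_w / block_w) * math.ceil(view_h / block_h)

    # Hypothetical 512 x 512 point view; compare small vs. large blocks.
    for bw, bh in [(32, 32), (128, 128), (512, 512)]:
        print(f"{bw} x {bh} blocks: {blocks_touched(512, 512, bw, bh)} reads")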

You may find Sparky's contour drawing slow even when the contours are
being cached. Sparky was written for a much older generation of
computers (mid 1990s), and what was needed for optimal performance back
then (small blocks of data, 32 Kbytes) no longer applies. As an
example, contouring an entire 2-D spectrum of 2048 by 4196 data points
with 7 positive and 5 negative contour levels takes about 10 seconds
the first time it is displayed, and 3 seconds the second time, on a 2
GHz dual G5 Mac system. I would expect the time to be proportional to
the number of contour levels. It may also depend on how low your
contour levels are set. In a modern implementation using OpenGL I
would expect the display to be instantaneous once the contours are
cached. The main bottleneck after caching is probably that rendering
the small 32 x 32 data point tiles that Sparky uses one at a time is
very inefficient with current window systems.
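
As a rough way to test the "time proportional to the number of levels"
expectation outside Sparky, one could time a generic contouring library on
an array of about this size. This is a hedged sketch using NumPy and
Matplotlib, not Sparky itself; the random data and the level values are
made up:

    # Rough, self-contained timing sketch: contour an array of roughly
    # the size mentioned above with increasing numbers of levels.
    import time

    import matplotlib
    matplotlib.use("Agg")  # headless: we only time the computation
    import matplotlib.pyplot as plt
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(size=(2048, 4096)).astype(np.float32)  # stand-in "spectrum"

    for n_levels in (4, 8, 12):
        levels = np.linspace(0.5, 3.0, n_levels)  # hypothetical positive levels
        fig, ax = plt.subplots()
        t0 = time.perf_counter()
        cs = ax.contour(data, levels=levels)
        _ = cs.allsegs  # force the line segments to be computed
        print(f"{n_levels} levels: {time.perf_counter() - t0:.2f} s")
        plt.close(fig)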

Tom

