Subject: Re: [nmr_sparky] Relaxation Peak Heights (rh) Hangs (NMRFAM v. 1.414)
From: Woonghee Lee
Date: Aug 23, 2019


Dear Ryan,

If your data fit the exponential function y = a exp(bx) well, the fitting should not take that long.

It may be related, but the number you see below does not necessarily indicate your peak count. If a fit fails and keeps iterating, that number will increase over time, so it can end up much higher than the number of peaks; treat it only as a rough indication of how far the fitting has progressed. If you are using a recent version, you can also click the Nessy button to generate multiple peak lists at once, although that only generates the input files for NESSY, which you then run yourself. If you want me to look more closely, you can share the data with me privately.
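For reference, the same kind of fit can be reproduced outside Sparky in a few lines of Python. This is only a minimal sketch using scipy, with made-up delay times and heights for a single peak:

import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    # y = a * exp(b * x); for a decay, b comes out negative
    return a * np.exp(b * x)

# Hypothetical relaxation delays (s) and heights for one peak
times = np.array([0.0, 0.05, 0.1, 0.2, 0.4])
heights = np.array([1.00, 0.78, 0.61, 0.37, 0.14])

# Reasonable starting guesses matter: a poor p0 can make an
# iterative nonlinear fit grind through many iterations.
p0 = (heights[0], -1.0 / times[-1])
params, cov = curve_fit(model, times, heights, p0=p0, maxfev=2000)
a, b = params
print("a = %.3f  b = %.3f  (decay rate %.3f /s)" % (a, b, -b))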

Best,
Woonghee

On Fri, Aug 23, 2019 at 9:43 AM rb.williams@... [nmr_sparky] <nmr_sparky@yahoogroups.com> wrote:

Hello,


I am trying to use Sparky to analyze an exponential decay. Currently I have five spectra loaded (test, test:1, test:2, etc.), and I am using relaxation peak heights (rh). In the setup page, I have set the times for each spectrum.


My protein is small, with only about 50 assigned peaks. I have removed peaks that overlap or have questionable lineshapes, so it should be a clean data set. We are using NMRFAM-SPARKY 1.414 (Sparky 3.135).


When I try fitting a small number of peaks for all five spectra, or when I fit all peaks for just four spectra, the program runs fine. However, if I select all peaks (pa) and then do the fit (Update), the program hangs: it uses 100% of the processor and runs for at least 10-15 minutes before I have to force it closed. If I do the update with no fit, the program runs fine, but then of course nothing is fitted.


I'm also not sure why the program reports 2739 peaks in the Peak Height Analysis window. Shouldn't there be only 5 * 50 = 250 peaks?


Regardless, is there a reason the program hangs? Is my only choice to fit five residues at a time? That seems rather tedious, although I could also export the heights and fit them separately, along the lines of the sketch below.
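For example (a minimal sketch in Python; the file name, column layout, and delay values are placeholders, assuming one whitespace-delimited row per peak with an assignment label followed by one height per spectrum):

import numpy as np
from scipy.optimize import curve_fit

times = np.array([0.0, 0.05, 0.1, 0.2, 0.4])  # delays from the rh setup (placeholder values)

def model(x, a, b):
    return a * np.exp(b * x)

with open("heights.txt") as f:
    for line in f:
        fields = line.split()
        if len(fields) != len(times) + 1:
            continue  # skip header or malformed rows
        label = fields[0]
        heights = np.array([float(v) for v in fields[1:]])
        try:
            (a, b), _ = curve_fit(model, times, heights,
                                  p0=(heights[0], -1.0 / times[-1]))
            print("%-12s a=%.3f b=%.3f" % (label, a, b))
        except RuntimeError:
            # curve_fit raises RuntimeError when it cannot converge,
            # so one bad peak does not stall the whole loop
            print("%-12s fit failed" % label)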


Thanks,

Ryan