Re: Spreadsheet Limitations - Thanks
Didn't know where you were getting the data, or I could have been more
precise. Contact me if you want more detail, since we're using GPS downloads
onsite.
Jack Earley
Radiological Engineer
Enercon Services, Inc.
6525 N. Meridian, Suite 503
OKC, OK 73116
phone: 405-722-7693
fax: 405-722-7694
jearley@enercon.com
************************************************************************
This e-mail and any of its attachments may contain Enercon Services, Inc.
proprietary information, which is privileged, confidential, or subject to
copyright belonging to Enercon Services, Inc. This e-mail is intended solely
for the use of the individual or entity to which it is addressed. If you
are not the intended recipient of this e-mail, you are hereby notified that
any dissemination, distribution, copying, or action taken in relation to the
contents of and attachments to this e-mail is strictly prohibited and may be
unlawful. If you have received this e-mail in error, please notify the
sender immediately and permanently delete the original and any copy of this
e-mail and any printout.
----- Original Message -----
From: "Tom O'Dou" <todou@cabreraservices.com>
To: "Radsafe (E-mail)" <radsafe@list.vanderbilt.edu>
Sent: September 25, 2001 4:22 p.m.
Subject: Re: Spreadsheet Limitations - Thanks
> Thank you all very much for your response.
>
> The general opinion seems to be that a database program such as Access
> would provide much more data handling and analysis capability. Several
> other options were also provided. I will look into them all (all are in
> this message - up to 1415 on 9/25/01).
>
> The data we have is collected at one-second intervals from a GPS unit
> and eight sodium iodide detectors. The data is logged into a program
> called Pathfinder, from which it is exported to an Excel file. In Excel
> the data is refined and exported to Surfer 7.0, which gives us a display
> of the response as a function of global position over our survey area.
>
> Our current approach is to split the data into multiple sheets in an
> Excel workbook. However, managing the large number of spreadsheets is
> difficult.
>
> Responses:
>
> MS Access and MySQL come to mind as good databases to use. They would
> probably require a fair bit of front-end work developing things like the
> DB structure and interface, but would be better for long-term
> manageability.
>
> Another quick and dirty method would be to store the data in a text or
> binary format file and use something like Perl to extract and analyze
> the data. Quick and dirty, but in the long term probably a pain in the
> butt.
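The "flat file plus a scripting language" idea above can be sketched in Python just as well as Perl. This is a minimal illustration under assumed conditions: the whitespace-delimited record layout (time, latitude, longitude, then eight detector counts) is invented for the example.

```python
# Keep the raw 1 Hz records in a flat text file and pull out only what
# a given analysis needs. The field layout (time, lat, lon, then eight
# detector counts) is assumed for illustration.
def detector_means(path, n_detectors=8):
    """Return the mean counts for each detector column in the file."""
    sums = [0.0] * n_detectors
    n_rows = 0
    with open(path) as f:
        for line in f:
            fields = line.split()
            if len(fields) < 3 + n_detectors:
                continue  # skip malformed or header lines
            counts = [float(x) for x in fields[3:3 + n_detectors]]
            for i, c in enumerate(counts):
                sums[i] += c
            n_rows += 1
    return [s / n_rows for s in sums] if n_rows else sums
```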
>
> If you're set on the spreadsheet thing, you might consider splitting the
> data up over several spreadsheet tabs or even separate spreadsheet files
> and linking them together, using a VB or VBA application to do your
> analysis.
>
> Eugene Mah
>
> I am no computer expert, but it sounds to me that a spreadsheet program
> may not be what you need. A database program (e.g., Access) would be
> able to handle the data and probably provide many more features to
> automatically evaluate and report the data.
>
> I think it was Paradox that I used during the Fort St. Vrain
> decommissioning to, at the click of a mouse, accept downloaded data from
> field instrumentation and data from laboratory counting equipment, group
> the data into the appropriate populations to define the survey units,
> statistically evaluate it, report outliers for further evaluation, and
> generate the tabular portion of the final report.
>
> Gerald Rood
>
> The simplistic answer is that you may need to go to a relational
> database (Access or that ilk). This is especially true if you are
> carrying a lot of redundant calculations in your spreadsheet columns.
>
> By going relational, you can break up your datasets, cross-reference them,
> and then create your output as you need to.
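The relational approach described above can be sketched with SQLite standing in for Access. All table and column names here are invented for illustration; the point is that the GPS fix is stored once and the eight detector readings reference it, rather than repeating the position columns everywhere.

```python
import sqlite3

# Minimal relational sketch: one table of GPS fixes, one table of
# detector readings keyed to the fix. SQLite stands in for Access;
# table and column names are invented for the example.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE fix (
        fix_id INTEGER PRIMARY KEY,
        logged_at TEXT, lat REAL, lon REAL
    );
    CREATE TABLE reading (
        fix_id INTEGER REFERENCES fix(fix_id),
        detector INTEGER,          -- detector number, 1..8
        counts REAL
    );
""")
con.execute("INSERT INTO fix VALUES (1, '12:00:00', 35.0, -97.0)")
con.executemany("INSERT INTO reading VALUES (1, ?, ?)",
                [(d, 100.0 + d) for d in range(1, 9)])

# Cross-reference the tables to build mapping-style output: position
# plus the summed response over all eight detectors at that point.
rows = con.execute("""
    SELECT f.lat, f.lon, SUM(r.counts) AS total_counts
    FROM fix f JOIN reading r ON r.fix_id = f.fix_id
    GROUP BY f.fix_id
""").fetchall()
```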
>
> Jim Barnes, CHP, RRPT
>
> I recommend using the worksheets within the spreadsheet workbook. You
> will have to break up your data recording into discrete increments to
> record in the worksheets. How much data is associated with each record?
> I have built some complex spreadsheets to perform a number of functions
> within our organization, but I can't give you much more advice without
> knowing more. I've found that at some point a spreadsheet used to track
> data needs to be converted to a database, due to the clunkiness of
> pulling reports together and analyzing the information.
>
> Greg Larson
>
> I would try using a different spreadsheet software program. I have used
> Sigmaplot running with Labview for many experiments that Excel could not
> handle. Not sure how many datapoints you can save, but it is significant.
> You may want to look at other programs as well.
>
> Jason Harris
>
> Your best bet is to convert the data into a database such as Microsoft
> Access. This also provides you with much more powerful data analysis
> capabilities.
>
> Fritz Strydom
>
> I would think you will need to switch to a different tool. Possibly, use
> a database to collect the datasets and link the spreadsheets to the
> tables in the database to perform your data reduction/analysis.
>
> George Crowley
>
> We have had similar problems at our facility and finally made the
> transition to an Access database. The nice thing about Access is that it
> is very compatible with Excel and other software. Hope this helps.
>
> Dave Schoep
>
> Try the new version of File Maker Pro. It should be able to
> handle it.
>
> Tom Shelley
>
> Yes - an Excel workbook can be set up with multiple "spreadsheets". You
> can set up one spreadsheet for water, one for soil, and one for air
> sample data. Another spreadsheet can be set up for data manipulation and
> summary presentation. Computer technology is often only limited by human
> imagination.
>
> Tom Bracke
>
> Lacking any information on how you are collecting, importing and
> processing the data, it's hard to give you any specific suggestions. The
> things that come immediately to mind are:
>
> * Put the data in arrays rather than a single column. Then a single sheet
> can hold 16+ million data points.
>
> * Segment the data into smaller chunks, process the chunks, and then
> process the processed data.
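The segment-then-recombine idea above can be sketched as a two-pass reduction. This is an illustration only: the (count, sum, max) summary is one assumed choice of per-chunk reduction, picked because it lets the second pass recover the overall mean and maximum without ever holding the full dataset in memory.

```python
# Pass 1: reduce each manageable chunk of readings to a small summary.
def summarize_chunk(values):
    """Return (count, sum, max) for one chunk of readings."""
    return (len(values), sum(values), max(values))

# Pass 2: process the processed data - combine the chunk summaries
# into overall statistics.
def combine(summaries):
    n = sum(s[0] for s in summaries)
    total = sum(s[1] for s in summaries)
    peak = max(s[2] for s in summaries)
    return {"mean": total / n, "max": peak}
```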
>
> Rick Edwards, Analyst
>
> I asked our resident statistician/programmer/CHP/CIH/CSP if this could be
> accomplished in Access instead. His response follows.
>
> "Access will accommodate the hundreds of thousands of records you are
> talking about. Access 95 had a limit of 1 gigabyte for the size of any
> individual database, I think. In Access 2000, the maximum individual
> database size is 2 gigabytes. You can have 255 columns (or fields) per
> database table, and no table can exceed 1 gigabyte. Since you can link
one
> database with another then total data storage is limited to the storage
> capacity of the computer equipment. When queries are properly configured,
> number crunching in Access is quite fast also, but not as fast as in
Excel.
>
> It is possible to do most things in Access that most people do in Excel
> simply through the use of queries and macros (without Visual Basic
> programming). When it comes to drawing charts and graphs, Access can do
> it, but I would suggest that you instead export the appropriate data
> (either directly or automatically through a macro) into Excel and let it
> do the charts. Tables containing massive amounts of data in Access (too
> large for Excel) can be processed (and reduced) and then exported to
> Excel to do the charting.
>
> For certain kinds of processing, Excel is better suited for the job than
> Access. If the amount of data does not overload Excel, or can be given
> to Excel in segments, then it is reasonably easy to make Access and
> Excel communicate in both directions.
>
> Let me point out that Excel can have up to (I think) 255 worksheets per
> workbook, and workbooks can also be linked together. However, I have
> noticed that when Excel is handling huge amounts of data, there seems to
> be a threshold above which Excel slows to an unacceptable crawl, perhaps
> when all the RAM is used up and virtual memory is utilized. Also, you
> may find Excel occasionally saying that there is not enough memory. I
> guess there are some things that require RAM only."
>
> Jack Earley
>
> There are two ways to deal with this; both involve macros. A simple
> macro will allow you to place the information vertically but observe it
> horizontally over the range of the screen, i.e., it is always viewed
> horizontally. Another is a bit more involved. It requires a standalone
> macro that collects and enters the data in a very simple spreadsheet.
> Calculations are performed in Excel either by the macro calling Excel or
> by having Excel call the data.
>
> Joe Alvarez
>
> Microsoft Access or some other database
>
> Ian Fraser
>
> A database might be a solution for such a large amount of data. Access
> or Oracle would be possible solutions, depending on how the data is
> accessed or manipulated.
>
> Louie Cueva
>
************************************************************************
You are currently subscribed to the Radsafe mailing list. To unsubscribe,
send an e-mail to Majordomo@list.vanderbilt.edu Put the text "unsubscribe
radsafe" (no quote marks) in the body of the e-mail, with no subject line.