Re: [buildcheapeeg] EDF and DDF file formats

From: Sar Saloth (sarsaloth_at_yahoo.com)
Date: 2002-03-13 21:28:16


>
>Ok; another question. When I first wrote the data acquisition routines for
>the ProComp+, I was grabbing an entire round-robin set (which would be 24
>sets of 6-byte packets). The checksum for this set is the 23rd packet, and
>the sync bytes are the last (24th) packet. At this point I would send the
>data on -- 24 samples of EEG data and 3 samples of non-EEG data (GSR, HR,
>etc.). This enabled me to guarantee that the data was correct. But then I
>was unsure if sending data in "bursts" like this was the best way, so I
>changed it to check each 6-byte packet according to the formatting rules
>for that packet (i.e., what I expect it to look like -- is this a data
>packet, is it a control word packet for sensor status, battery level,
>checksum, etc.? see the Verify() method in class ProCompPacket) and, if it
>checks out, send the sample data immediately. If it does not pass the
>verification test, I stop reading data until I find the sync and begin the
>process again. In this way I am not holding data until I can get the
>footer checksum/syncs, but then again, there is the possibility of sending
>corrupt data if the checksum fails, or I am suddenly not in sync with the
>packet boundaries and do not realize this until I do not get the expected
>sync footer when I should.

As far as the data file goes, I would continue writing the data, including
the junk and bad samples, so as to keep every record complete and the data
in sync, and tag it with the suggested "Bad Data Marker" bit-map stream.
That would fit in with your current logic flow, it wouldn't impact much
else, and if the Bad Data Marker were at the end of the record, you could
tag it afterwards. That way your EDF would always line up right.
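The tagging scheme above could be sketched roughly like this (the record size, padding value, and bitmap layout here are my own assumptions for illustration, not anything from the EDF spec or the ProComp code):

```python
# Hypothetical sketch: always emit a fixed-size record, padding lost or
# junk samples, and build a "Bad Data Marker" bitmap that can be written
# at the end of the record after the fact.

SAMPLES_PER_RECORD = 8  # e.g. 8 EEG samples per 1/32 s record (assumed)

def build_record(samples):
    """samples: list of (value, is_valid) pairs for the slots we received.
    Returns the values padded to the full record size, plus a bitmap
    with bit i set when slot i held junk or was never received."""
    values = []
    bad_bitmap = 0
    for i in range(SAMPLES_PER_RECORD):
        if i < len(samples) and samples[i][1]:
            values.append(samples[i][0])
        else:
            values.append(0)        # pad so the record stays full-size
            bad_bitmap |= (1 << i)  # ...and mark the slot as bad
    return values, bad_bitmap
```

Because the bitmap is computed as the record fills, it can simply be appended as the record's last channel, which matches the "tag it afterwards" idea.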

This brings up an annoyance with EDF: the channels are organized
contiguously within one record, so if your record size is large (because
you want to avoid wasting space on the low-data-rate signals), your
streaming latency is at least one full record.

Explanation: every data record has identical size and channel grouping, so
the slowest-rate signal must be present in every data record. Either you
pad and waste space (by duplicating the slow channels), or you accept a
latency at least as long as one sample period of the slowest signal (for
communication purposes).
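The arithmetic behind that latency bound is simple; the channel rates below are illustrative assumptions, not ProComp+ specifics:

```python
# Every EDF record must span at least one sample of the slowest channel,
# so the record duration -- and hence the latency of streaming whole
# records -- is bounded below by the slowest channel's sample period.

channel_rates_hz = {"EEG": 256, "GSR": 32, "HR": 1}  # hypothetical rates

slowest_rate = min(channel_rates_hz.values())
min_record_duration_s = 1.0 / slowest_rate  # one period of the slow signal

# With a 1 Hz channel in the mix, each record spans at least 1 s, so a
# receiver reading whole records runs at least 1 s behind the amplifier.
```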

Does that throw EDF out of the window for future CheapEEG communication
formats? (this is separate from the actual file format issue).

>I like doing it this way because I am able to send the data out
>immediately, and I have to think that it would be rare to receive a
>corrupt packet set. But what do you (or others) think? Would it be better
>(or would it make any difference) to send data out in 8-sample bursts
>(for EEG) and 3-sample bursts (for non-EEG data) for this device? Is
>there a situation where it would matter?
>

Wouldn't the only real difference be latency? Would such latency be adequate?

It would matter if you had a processor with only a small number of bytes
of memory: EDF is organized more or less as you just described above,
while your data communication format is organized differently, so a
complete record would need to be stored before it could be repacked. I
doubt this impacts anyone here; I am thinking of stupid 8051-type
processors with 256 bytes of RAM or so.
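A back-of-the-envelope sketch of that memory point, using assumed figures (2-byte EDF samples, a 1 s record, and the same hypothetical channel rates as above):

```python
# Buffering one complete EDF record before transmission can already
# exceed a tiny microcontroller's RAM.

record_duration_s = 1.0
bytes_per_sample = 2  # EDF stores 16-bit little-endian samples
channel_rates_hz = {"EEG": 256, "GSR": 32, "HR": 1}  # hypothetical rates

record_bytes = sum(int(rate * record_duration_s) * bytes_per_sample
                   for rate in channel_rates_hz.values())

# 289 samples * 2 bytes = 578 bytes -- already more than the ~256 bytes
# of RAM on a bare 8051, so such a chip cannot buffer record-at-a-time.
```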

>Does this affect data storage using the methods we have been talking
>about, since we are storing "chunks", if I suddenly realize that I am no
>longer synced on correct data boundaries? Should I just discard whatever
>I have of the current set, or save it even though this might mean that I
>only have, say, 6 EEG samples (and no non-EEG samples) that I can
>reasonably be sure of at that point for the current packet set?

As long as you save complete EDF records and mark them as bad, your
current scheme should be OK -- is that right?

> >> What would our minimum resolution be (if any)? Would we ever
> >> have, say, < 1 second, and thus have several "chunks" per second?
> >
> >Yes, definitely. For ProComp, you could use 1/32 sec, which gives 8
> >EEG samples and 1 of each of the other samples.
>
>Got it. However, what if something like the above situation occurs, and I
>suddenly lose the sync when I am at the beginning of a set and only have
>6 EEG samples, and now have to wait for the sync to start all over again?
>What do I do with those 6 samples?
>
>Dave.

Wow, 1/32 s latency should be OK for any display requirement, shouldn't
it? I mean, movie film is only 24 frames per second, and that is regarded
as fluid. I am ignorant of NF training, so if I am wrong, someone just
tell me, but don't try to debate it; I have nothing to debate with.
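The film comparison checks out numerically -- a 1/32 s chunk is shorter than one frame of 24 fps film:

```python
# Compare the chunk latency above with the frame period of 24 fps film.

chunk_latency_ms = 1000 / 32  # 31.25 ms per 1/32 s data chunk
film_frame_ms = 1000 / 24     # ~41.7 ms per movie frame

# The chunk period is the shorter of the two, so chunk granularity alone
# shouldn't make a display feel less fluid than cinema.
assert chunk_latency_ms < film_frame_ms
```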




This archive was generated by hypermail 2.1.4 : 2002-07-27 12:28:40 BST