27-II-1995.

To work, over Bački Breg.

The transactional tables had long ago been split per patients' year of birth, back when the altos crashed daily, yet it seems that only today (the 28th) did I write a centralized routine to open them. The matter being as convoluted as it was, better that it be convoluted in only one place.

Here are the comments from that piece of code:

called from several places around the app

open qamfo and qambd which fit the current JMBG

year of birth, extracted from jmbg, part of the name of tables to open

the names of main big and main small for the year, preserved for later use

there's no directory control, .\dossz\ is made manually during installation

if it doesn't exist, copy the structure and indexes from template

alias forever, qamfo and qambd (Qmulative :)
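The gist of those comments, if I were sketching it today in Python rather than fox, would look something like this - the table-naming scheme, the template location, and the century cutoff for the JMBG year digits are my assumptions for illustration, not the original code:

```python
import os
import shutil

DOSSZ = os.path.join(".", "dossz")        # made manually during installation, no directory control
TEMPLATE = os.path.join(".", "template")  # healthy empty tables with indexes (assumed location)

def birth_year(jmbg: str) -> int:
    """Extract the year of birth from a JMBG: digits 5-7 (0-indexed 4:7)
    are the last three digits of the year; >= 800 means the 1800s/1900s."""
    yyy = int(jmbg[4:7])
    return 1000 + yyy if yyy >= 800 else 2000 + yyy

def open_for_jmbg(jmbg: str) -> dict:
    """Return the per-year main big and main small table paths
    (the file-name stems here are guesses), creating them from the
    template's structure when they don't exist yet."""
    year = birth_year(jmbg)
    paths = {}
    for alias, stem in (("qamfo", "fo"), ("qambd", "bd")):
        path = os.path.join(DOSSZ, f"{stem}{year}.dbf")
        if not os.path.exists(path):
            # copy structure (and, in the real thing, indexes) from template
            shutil.copy(os.path.join(TEMPLATE, f"{stem}.dbf"), path)
        paths[alias] = path  # alias preserved for later use
    return paths
```

The fixed aliases are the point: every caller around the app refers to qamfo and qambd and never needs to know which year's file is actually underneath.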

The whole trick was in having one instance of each of those two transactional tables open, but writing nothing into them. The standard generated (once, then poked manually) piece of code which would open them did exist, but did nothing. It was unosbe.prg which opened the tables; each machine had a copy of these tables on its C: disk, and only when a record was processed did a copy of it get into the corresponding generational table on the server. All because of one of Novell's bugs... I could have switched back to how it once was, but I left it as it was: it's safer this way and there are fewer clashes over record locking. We actually never got any messages about locking. For a system with forty-some workstations at its peak, not a small feat.

By the end of the day (last file at 2:15, after which Joška and I had another cigi and went slowly to bed) I worked on some reports, about waiting lists and work plans. There's no actual planning in them; it's about what already happened. Perhaps they once served some planning and thus earned the name.

The next morning, some addition to that, and then nothing for the rest of the day. Perhaps I was at Gemenc, who'd know that now. Then I resumed work on rebfpt.prg, which I continued until the 3rd of March. Screwy. Luckily, we had an official version of fox, with all the books that came with it, and somewhere in there was a detailed structure of everything, including the .fpt files where memos are stored, so I knew what I was doing. Actually doing it was a different matter... because the goals had expanded: if a .fpt is screwed up, there's also a good chance that the .dbf is too, so better write a regenerator for that as well. The (header of the) table is the source of information on its own structure, i.e. its fields, and if it's crashed it can't be trusted, so I took that information from a healthy table and stored it in the generated code. The code would create a new table with that stored structure and fill it with the data from the crashed table, dragging out as much as it possibly could. Though in the cases when Fata fucked up (i.e. the FAT table was screwed) and some blocks in the middle were missing, the records would begin at wrong offsets for the rest of the file, and everything would be shifted, unusable.
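The salvage idea - trust the layout stored in the generated code, not the possibly-crashed header - can be sketched like this in Python. The field names, widths, and header length are placeholders for whatever was captured from the healthy twin; the deletion-flag convention is the standard dBASE one:

```python
# Field layout captured from a healthy table of the same structure.
# These particular fields are only an illustration.
FIELDS = [("JMBG", 13), ("NAME", 30), ("DATE", 8)]
RECLEN = 1 + sum(width for _, width in FIELDS)  # +1 for the deletion-flag byte

def salvage(path: str, header_len: int) -> list:
    """Read fixed-length records from a possibly damaged .dbf, using the
    record layout stored in code instead of the (maybe crashed) header.
    header_len would likewise come from the healthy table."""
    rows = []
    with open(path, "rb") as f:
        f.seek(header_len)          # skip straight past the untrusted header
        while True:
            rec = f.read(RECLEN)
            if len(rec) < RECLEN:   # truncated tail: take what we got
                break
            if rec[0:1] == b"*":    # '*' marks a deleted record in dBASE
                continue
            row, pos = {}, 1
            for name, width in FIELDS:
                row[name] = rec[pos:pos + width].decode("ascii", "replace").strip()
                pos += width
            rows.append(row)
    return rows
```

This is exactly the approach that fails when FAT blocks go missing in the middle: the byte stream shortens, every subsequent record starts at a wrong offset, and the fields come out shifted garbage.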

On the 1st of March, the sending routine (written two weeks ago, and it would be picked at a few more times until November) - sending to a different doctor, a test, a lab etc. It's complicated in real life, there are various cases, but in the end I made the code simple and clean. Liked that.

The fourth (Saturday), at home: did something for zzzzz in the morning, something for MXM in the afternoon, about procurement, i.e. the tractor weigh station. The scale was the first such device that we ever connected to a PC, and I was really wondering how that would go. It turned out to be a simple serial connection, which worked by running a smallish executable, less than 1K, which would output the current weight in kilograms. I redirected the output into a text file, ran the exe invisibly somehow (don't remember how that was done under DOS), and the app would then read that file into a string, convert it into a number, and we were done. Actually, one more step: delete the file, so that next time a new file would be made, and if it wasn't made, the app would signal an error rather than use the old file from last time.
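The whole read-redirect-parse-delete dance fits in a few lines. A modern Python sketch of it - the executable and file names are made up, and I've parameterized the command so it can be tried with anything that prints a number:

```python
import os
import subprocess

WEIGHT_FILE = "WEIGHT.TXT"   # hypothetical name for the redirected output
SCALE_EXE = "SCALE.EXE"      # the sub-1K vendor executable (name assumed)

def parse_weight(text: str) -> float:
    """The scale prints one number: the current weight in kilograms."""
    return float(text.strip())

def read_weight(cmd=(SCALE_EXE,)) -> float:
    """Run the scale's tiny executable with stdout redirected to a file,
    then parse the file. Deleting it afterwards guarantees a stale
    reading can never be mistaken for a fresh one."""
    if os.path.exists(WEIGHT_FILE):
        os.remove(WEIGHT_FILE)                 # never trust last time's file
    with open(WEIGHT_FILE, "w") as out:
        subprocess.run(list(cmd), stdout=out)
    if not os.path.exists(WEIGHT_FILE) or os.path.getsize(WEIGHT_FILE) == 0:
        raise RuntimeError("scale produced no reading")
    with open(WEIGHT_FILE) as f:
        kilograms = parse_weight(f.read())
    os.remove(WEIGHT_FILE)                     # force a fresh file next time
    return kilograms
```

Under DOS the redirect and the invisible launch were the fiddly parts; the parsing was the trivial bit, same as here.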

Just like any vehicle load weighing, it was weighed twice - once incoming, once after unloading (or loading) - and the difference was what counted. They also measured the humidity - someone would climb onto the trailer, scoop a measure (of wheat, corn, sunflower, whatever it was), poke the meter into it and read the percentage off its screen. It converted resistance into a percentage - it had two prongs with metal tips, so the distance between them was fixed. The guy would also, by the method of exact eyeballing, gauge the grain quality; that would also be entered, and there, the app did its job.
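The arithmetic of a delivery is as plain as it sounds - two weighings, their difference, plus the two hand-measured values. A sketch, with field names that are mine, not the original app's:

```python
def net_weight(kg_first: float, kg_second: float) -> float:
    """Weighed twice, once loaded and once empty (either order);
    only the difference counts."""
    return abs(kg_first - kg_second)

def delivery_record(kg_in: float, kg_out: float,
                    moisture_pct: float, quality: str) -> dict:
    """One load, as it would be entered at the weigh station."""
    return {
        "net_kg": net_weight(kg_in, kg_out),
        "moisture_pct": moisture_pct,  # read off the prong meter's screen
        "quality": quality,            # gauged by exact eyeballing
    }
```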

At least the building of the weigh station was brand new, so while it looked the same as any other, it was at least clean. So was the barn - a big roof with two walls against the prevailing winds, enter from the front, exit left - so we didn't have the standard problems with damp, dusty air, or mice, for that matter.


Mentions: altos, fox, Gemenc Polyclinic, Joška Apro, MXM, rebfpt.prg, zzzzz, in serbian