Amiga disassembly
Re: Amiga disassembly
I'm finding some strange stuff... The program seems to have a bunch of routines that play some sort of audio through channel 1, which seems to be related to the serial routines. Maybe it's just a diagnostic thing (i.e. it sends audio if it's receiving data), or maybe something much stranger: perhaps they had some sort of repeater setup where, if you had multiple Prevue machines in one place, you could hook the audio output of one up to the demod of another, and avoid hooking up a million demods to a million satellite receiver outputs. (I'm thinking Primestar here - I know that they used LaserDisc machines for most of their guides, although the primary one probably used satellite.)
Re: Amiga disassembly
Making some progress. I found the code that handles key presses in grid mode, which really helps, as the functions it leads to make a lot of sense and help me figure out where things are stored in memory. I had assumed it would handle it the way the Atari code does, where it compares the key pressed to something in a register (i.e. "was the R key pressed? if not, was the S key pressed?"), but instead it takes the byte value of the key pressed and just keeps subtracting one from it, checking whether the value has reached zero yet.
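To illustrate the pattern, here's a rough C sketch (the key values and handler names below are made up for illustration, not taken from the disassembly):
Code: Select all
#include <stdio.h>

/* Illustrative only: dispatch by repeatedly subtracting one from the key
 * value and branching when it reaches zero, instead of comparing the key
 * against each expected code in turn. */
static void do_first(void)  { puts("first handler"); }
static void do_second(void) { puts("second handler"); }
static void do_third(void)  { puts("third handler"); }

static void handle_grid_key(unsigned char key)
{
    unsigned char n = key;               /* running value */
    if (n == 0)   { do_first();  return; }
    if (--n == 0) { do_second(); return; }
    if (--n == 0) { do_third();  return; }
    /* ...one more decrement-and-test per additional key... */
}

int main(void)
{
    handle_grid_key(2);                  /* two decrements -> third handler */
    return 0;
}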
I also found the serial handling code, which appears to work in a similar way, but it is much more complicated, so I'll have to delve into that a little bit more.
Re: Amiga disassembly
I found the code that handles the mode K serial command! The good news is, the command is still 8 bytes long just like on the Atari. In order for the command to result in a "CErr":
- the first byte would be greater than or equal to 7 (or negative)
- the second byte (byte 1) would be greater than or equal to $C (or negative)
- the seventh byte (byte 6) would be greater than or equal to $4C
Otherwise, the command seems to be considered successful.
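Expressed as a rough C sketch (not the disassembled code - the comparisons are signed so that "negative" values fail too):
Code: Select all
#include <stdint.h>

/* Sketch of the CErr check described above; cmd points to the 8-byte
 * mode K command. Signed comparisons make negative bytes fail as well. */
int mode_k_is_cerr(const int8_t cmd[8])
{
    if (cmd[0] >= 0x07 || cmd[0] < 0) return 1;   /* first byte   */
    if (cmd[1] >= 0x0C || cmd[1] < 0) return 1;   /* second byte  */
    if (cmd[6] >= 0x4C)               return 1;   /* seventh byte */
    return 0;                                     /* otherwise accepted */
}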
I'm going to bed now, but tomorrow I may figure out how to get time commands to work on the Amiga.
Re: Amiga disassembly
AriX wrote:
I found the code that handles the mode K serial command! The good news is, the command is still 8 bytes long just like on the Atari. In order for the command to result in a "CErr":
- the first byte would be greater than or equal to 7 (or negative)
- the second byte (byte 1) would be greater than or equal to $C (or negative)
- the seventh byte (byte 6) would be greater than or equal to $4C
Otherwise, the command seems to be considered successful.

Well... I guessed the length was 8 bytes (I think I posted before) because the CErr count only ticks over after 8 bytes have been sent. If you try to send 7 or 9 bytes, for example, it fails the checksum.
- byte 1 is the day of the week, so only values 0-6 are valid (checks out)
- byte 2 is the month, so only values 0-$B are valid (checks out)
- byte 7 is seconds, so only values 0-$3B are valid (hmmm, not quite $4C, but close enough that the byte probably doesn't contain something completely different)
I would have thought there might be other restrictions too, like on byte 5 (hours) and byte 6 (minutes)? I have suspected it validates what's in bytes 8 and 9 as well, but I haven't been able to discover what they are from the Atari code. That's just speculation on my part, though.
Anyway, what you are doing is very interesting - hopefully this is some serious progress!!
Re: Amiga disassembly
Got it! So it turns out, in addition to the conditions I mentioned above, two other things can result in a CErr. One is the state of some location in memory that I don't understand yet, and the other is the value of "ClockCmd." I'm sure you've noticed that when you are in an ad and press the "F" key, you will see a bunch of diagnostics, one of which is "ClockCmd." It turns out that if the ClockCmd is anything but "2," the K command will be dismissed and count as an error.
In the copy of the software we have, ClockCmd is 1. At first, I thought that maybe the ClockCmd could be changed by creating a clock.cmd file, since the software seems to look for that file a few times. No dice. Eventually, I found that the value for ClockCmd, along with a bunch of other variables, is found in the file "config.dat." Once unpacked, you may see that the last byte of the file is $31, or 1. If you change that to $32 (2), and restart the software, ClockCmd will read "2," and sending a K command will magically work!
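For reference, this is the kind of one-byte patch I mean - a rough C sketch that flips the last byte of an already-unpacked config.dat from $31 to $32 (the path is just illustrative, and repacking, if needed, is a separate step):
Code: Select all
#include <stdio.h>

/* Sketch only: open config.dat, check that its last byte is $31 ('1'),
 * and overwrite it with $32 ('2') so ClockCmd reads 2. */
int main(void)
{
    FILE *f = fopen("config.dat", "r+b");
    if (!f) { perror("config.dat"); return 1; }

    fseek(f, -1, SEEK_END);            /* seek to the last byte */
    int last = fgetc(f);
    if (last == 0x31) {
        fseek(f, -1, SEEK_END);
        fputc(0x32, f);                /* ClockCmd = 2 */
    }

    fclose(f);
    return 0;
}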
The first time I did it, though, it didn't work at all. tin, I don't know how you've been calculating your year values in the K command, but the way I did it was I took the last two digits of the year (for example, 11), converted them into hex, and included them as the year value. This works fine on the Atari, but on the Amiga, it always sets the date to January 1st, 1970, at 1:27:29 AM. It seems that this value should in fact be stored as the years since 1900 - so, currently $6F (111 years since 1900). Other than that, the Amiga mode K format is the same as the Atari. The Amiga, however, calculates days of the week based on the year, month, and day of the month, instead of using the day of the week value (the first byte).
Now, why would the K command be disabled by default? Well, I don't quite know, but I would guess that there is a newer version of the date command under some other letter that they used instead for some reason. The guy who wrote the original software told me specifically that someone had changed the date command because of the Y2K craziness that was going on at the time, despite the fact that the old version of the software was technically Y2K compliant (up to 2155, at least). I don't know why they would disable it in this fashion, but hopefully at some point I'll figure out the new version of the command.
Lastly, if you don't want to change the config.dat file yourself, today I also figured out how to use the H mode (file download), so here are commands that will send over the updated config.dat. You'll need to reboot (or send a reset command) for it to take effect.
Code: Select all
55AABBBB00FF0D0A55AA412A41322A00CD0D0A55AA484446303A636F6E6669672E64617400EA0D0A55AA4800343243303130383038474E414530314E4E4E4E4E4E4C32393036595959323333363036303135313030594E59438E384E4E4E4E4E3227000D0A55AA480100B60D0A55AABBBB00FF0D0A
Re: Amiga disassembly
If anyone is wondering about the H command, here's how it works:
First, obviously, you have to use the A command to get the software to listen. Then:
Code: Select all
55 AA 48 (48 is hex for H) 44 46 30 3A 63 6F 6E 66 69 67 2E 64 61 74 (this is hex for "DF0:config.dat"; basically just the path of the file to save) 00 EA (00 and then the checksum)
Next, you start sending the actual data:
Code: Select all
55 AA 48 (the header for the H command again) 00 (the "packet number" of the data you're sending) 34 (the number of bytes of data to follow; possibly limited to 80 bytes - if this value is wrong, the scroll will stop and the software will show a scary warning) 32 43 30 31 30 38 30 38 47 4E 41 45 30 31 4E 4E 4E 4E 4E 4E 4C 32 39 30 36 59 59 59 32 33 33 36 30 36 30 31 35 31 30 30 59 4E 59 43 8E 38 4E 4E 4E 4E 4E 32 (the actual data to send, in this case $34 bytes) 27 00 (the checksum; note that for some reason the checksum comes BEFORE the 00 in this instance instead of after)
If you want to send more data, you can continue to send more packets. Simply increment the packet number every time you send data, and it will be appended to the data to be written. You can also send the same packet multiple times; it seems that UV actually sent each packet twice for redundancy: if the first one wasn't received properly, the second one would be used instead (based on the checksum).
Once you're done sending your packets, you must send one final, null packet to get the file to write out:
Code: Select all
55 AA 48 01 00 B6 (here I use packet number $01 because I only sent one packet so far)
I would guess that the software can support up to 255 bytes per packet, but that they actually only used 80 for the redundancy. If you use 80 bytes per packet, you can write up to 20.4 kB to a file (255 packets x 80 bytes = 20,400 bytes); if you use 255 bytes per packet, you can write up to 65.025 kB per file.
One final note: all data is PowerPacked before being written to disk.
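For anyone who wants to script this, here's a rough C sketch of the three packet layouts described above. The function names and parameters are just illustrative, and since the checksum calculation isn't spelled out in this post, each function takes the checksum byte precomputed:
Code: Select all
#include <stdint.h>
#include <string.h>
#include <stddef.h>

/* Sketches of the H-mode packet layouts shown above - not the real UV tool. */

/* Filename packet: 55 AA 48, path (e.g. "DF0:config.dat"), 00, checksum. */
size_t build_h_path_packet(uint8_t *out, const char *path, uint8_t checksum)
{
    size_t n = 0;
    out[n++] = 0x55; out[n++] = 0xAA; out[n++] = 0x48;   /* 'H' header */
    memcpy(out + n, path, strlen(path));
    n += strlen(path);
    out[n++] = 0x00;
    out[n++] = checksum;
    return n;
}

/* Data packet: 55 AA 48, packet number, byte count, data, checksum, 00.
 * Note the checksum comes BEFORE the trailing 00 here. */
size_t build_h_data_packet(uint8_t *out, uint8_t packet_num,
                           const uint8_t *data, uint8_t len, uint8_t checksum)
{
    size_t n = 0;
    out[n++] = 0x55; out[n++] = 0xAA; out[n++] = 0x48;
    out[n++] = packet_num;
    out[n++] = len;                  /* apparently kept to 80 ($50) in practice */
    memcpy(out + n, data, len);
    n += len;
    out[n++] = checksum;
    out[n++] = 0x00;
    return n;
}

/* Final null packet, mirroring the example above (55 AA 48 01 00 B6):
 * 55 AA 48, next packet number, 00, checksum. */
size_t build_h_final_packet(uint8_t *out, uint8_t next_packet_num, uint8_t checksum)
{
    size_t n = 0;
    out[n++] = 0x55; out[n++] = 0xAA; out[n++] = 0x48;
    out[n++] = next_packet_num;
    out[n++] = 0x00;
    out[n++] = checksum;
    return n;
}
Looking at the config.dat command stream in my previous post, each command also appears to be terminated with 0D 0A; those trailing bytes aren't included in these sketches.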
Re: Amiga disassembly
AriX wrote:
Got it! So it turns out, in addition to the conditions I mentioned above, two other things can result in a CErr. One is the state of some location in memory that I don't understand yet, and the other is the value of "ClockCmd." I'm sure you've noticed that when you are in an ad and press the "F" key, you will see a bunch of diagnostics, one of which is "ClockCmd." It turns out that if the ClockCmd is anything but "2," the K command will be dismissed and count as an error.

Cool, there is obviously something more to this than we currently understand.
AriX wrote:
The first time I did it, though, it didn't work at all. tin, I don't know how you've been calculating your year values in the K command, but the way I did it was I took the last two digits of the year (for example, 11), converted them into hex, and included them as the year value. This works fine on the Atari, but on the Amiga, it always sets the date to January 1st, 1970, at 1:27:29 AM. It seems that this value should in fact be stored as the years since 1900 - so, currently $6F (111 years since 1900). Other than that, the Amiga mode K format is the same as the Atari. The Amiga, however, calculates days of the week based on the year, month, and day of the month, instead of using the day of the week value (the first byte).

Now, my memory of this might be quite rusty, as it was about a year ago now that I did the disassembly, but I am reasonably sure the Atari code only works out the jday from the clock information. It uses the year to work out whether it's a leap year or not, to help in working out the jday. The day/date is not otherwise displayed on screen (unlike the Amiga). Because of that, I probably made a wrong assumption when calculating the value for the year, because it doesn't really matter on the Atari, apart from the jday being one out after February if the code thinks it's a leap year when it's not. Indeed, the Atari initialises to $56 (=1986?), which ties in with your years-since-1900 thing. The Atari code just checks that the last two binary digits are 00 to ensure it's a leap year, which, for 1900 + every 4 years, it will be (almost - it will go wrong in the year 2100). It otherwise doesn't care what year it is.
AriX wrote:
Now, why would the K command be disabled by default? Well, I don't quite know, but I would guess that there is a newer version of the date command under some other letter that they used instead for some reason. The guy who wrote the original software told me specifically that someone had changed the date command because of the Y2K craziness that was going on at the time, despite the fact that the old version of the software was technically Y2K compliant (up to 2155, at least). I don't know why they would disable it in this fashion, but hopefully at some point I'll figure out the new version of the command.

After looking at the year thing, I really can't work that out. I thought it might be due to differences between the two, but it seems not. I did speculate that the updated Amiga command would be "k", as that seems to be the pattern for updated commands.
Re: Amiga disassembly
tin wrote:
The Atari code just checks that the last two binary digits are 00 to ensure it's a leap year, which, for 1900 + every 4 years, it will be (almost - it will go wrong in the year 2100). It otherwise doesn't care what year it is.

Ah, so that's what my friend meant when he said that the old system was Y2K-compliant until 2099!
tin wrote:
I did speculate that the updated Amiga command would be "k", as that seems to be the pattern for updated commands.

You're probably right! I haven't yet completely figured out how the serial handling works, but I will let you know when I do! I will specifically check for the lowercase k - I agree with your lowercase command theory.
Meanwhile, I've finally found the CTRL buffer handling routines, and I'm currently working through that. I re-read through the "110-baud bit-banging" thread, where you posted some helpful information on how things work under the hood, which really helped me out.
Re: Amiga disassembly
Here is the disassembly (heavily commented by me) of the function which I have named readCTRL: http://prevueguide.com/readCTRL.asm.txt
I translated it into C to the best of my ability to try and get it to make more sense: http://prevueguide.com/readCTRL.txt
If I'm understanding it correctly, it does seem like it works in a very interesting way. The function reads in one bit to a "bit buffer," which stores each bit as a full byte (containing $FF or $00) for some reason. <strike>Once a certain amount of bytes have been received (seems to be 94 bits, which, minus some amount of start/stop bits, comes out to about 10 bytes or something?), it moves everything from the bit buffer to the main ring buffer, which is 500 bytes ($1F4) long. It also seems to require something crazy in the way of start and stop bits (2 or 4 start bits or something?) - but one set of start bits seems to apply to 10 or 11 bytes. That is, instead of sending stuff like a real serial port, where you send start bit(s), the byte, and then stop bit(s), this may be you send start bit(s), a whole bunch of bytes, and THEN stop bit(s). I'm not sure what the function of this is, as I haven't yet started to analyze the function that parses the main ring buffer.</strike> I feel like I'm much closer to getting it working, as some of it is actually starting to come together...
Re: Amiga disassembly
Okay, so I don't quite have the signed/unsigned integer thing down, but part of what I said in my previous post was grossly and ridiculously untrue. How it actually works is more straightforward:
- 1 short start bit
- 1 long start bit (the bit only needs to be 1 at the beginning, because no data is read afterwards for several units of time)
- 8 data bits
- 1 flexible-length stop bit
Now, the weird part is the difference in duration of the start/stop bits. Both start bits are 1 and the stop bit is 0, but the first start bit is expected to remain for 3 units of time, while the second is expected to remain for 10 units of time. Each data bit is then expected to remain for 10 units of time, and then the stop bit is flexible: it must remain for at least 1 unit of time, but can continue after that for any amount of time; data is not read again until the start bits are detected.
So perhaps a more accurate way of writing this would be:
- 1 extra-long start bit (lasts 13 time units; the bit only needs to be 1 (high) for the first 4 time units out of the 13)
- 8 data bits (10 time units each)
- 1 stop bit
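To make the timing concrete, here's a rough C sketch of a receiver following that framing. This is my simplification, not the disassembled readCTRL routine: the sampling functions are hypothetical, the bit order is assumed MSB-first, and each data bit is sampled roughly mid-way through its 10 units.
Code: Select all
#include <stdint.h>

extern int  sample_ctrl_line(void);   /* hypothetical: returns 1 (high) or 0 (low) */
extern void wait_one_unit(void);      /* hypothetical: delay for one time unit */

/* Returns the received byte (0-255), or -1 if the stop bit wasn't 0. */
int read_ctrl_byte(void)
{
    /* Wait for the start condition: the line going high. */
    while (!sample_ctrl_line())
        wait_one_unit();

    /* Rest of the start period: the line only has to stay high for the
     * first few units of the ~13-unit start, so just wait it out. */
    for (int i = 0; i < 13; i++)
        wait_one_unit();

    /* 8 data bits, each held for 10 units; sample roughly mid-bit. */
    uint8_t value = 0;
    for (int bit = 0; bit < 8; bit++) {
        for (int i = 0; i < 5; i++)
            wait_one_unit();
        value = (uint8_t)((value << 1) | (sample_ctrl_line() & 1));
        for (int i = 0; i < 5; i++)
            wait_one_unit();
    }

    /* Stop bit: must read 0, otherwise the byte is discarded - which is
     * exactly what happens when CTS is left on constantly. */
    if (sample_ctrl_line() != 0)
        return -1;

    return value;
}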
Fortunately, this makes a lot of sense given the behavior we've observed with the real Amigas when using the CTRL data:
- if CTS is off, start bits will never be detected, and all data will be discarded
- if CTS is constantly on, start bits will be detected, and all data bits will be read as 1 - but then, when it comes time for the stop bit, the bit will be read as 1, and all data will be discarded and not read into the main buffer
- if CTS is turned on and then off, the same as above will happen, but it is likely that the CTS will be turned off during the data bit time or the stop bit time, causing the start bit to be read as 1 and the stop bit to be read as 0, which is all we need for a byte to be read
In plainer terms, the H that previously seemed to increment every time a bit was sent is actually incrementing every time it receives what it thinks is a byte, because anything that starts with 1 and ends with 0 is a byte to it.
If this whole thing made no sense, feel free to ask me to explain it better. I don't know the correct terminology for this stuff.