
This was a timely topic. I just got a command base in for evaluation; I actually got the base, the CAB2, and a locomotive, and the owner wanted them all checked out.

 

Turns out the base was the issue.

 

Here's my Legacy base.

 

[Image: ADS00002]

 

Here's the problem base.

 

[Image: ADS00003]

 

I also wonder about the significance of the different frequency displayed in the measurement panel; any hints there, Dale? That reading was consistent and didn't really change. Obviously the much lower amplitude was a real issue, and the base would barely run a locomotive, only in fits and starts.

Originally Posted by gunrunnerjohn:

I also wonder about the significance of the different frequency displayed in the measurement panel; any hints there, Dale? That reading was consistent and didn't really change. Obviously the much lower amplitude was a real issue, and the base would barely run a locomotive, only in fits and starts.

What is the f= frequency? It's different from the Freq number in the right panel.

I think that has something to do with sampling, but I'm not sure; I'll have to read my manual again. However, the frequency in the lower right of the measurement panel was what I was talking about. OTOH, maybe that's something different and the f= on the display is the measured frequency. The 'scope gives you so much information that it's hard to sort it all out.

 

I'm sure the basic issue is the amplitude; I was surprised to be able to use that information so soon.

 

 

The "idling" signal is constantly being modulated with a "no op" signal that keeps the locomotives alive.  Maybe the scope just captured different segments of the FM signal, once with the lower frequency, and the other with the higher frequency.

With a conventional counter set to a relatively long sampling period, I read 454.88 kHz according to my notes.

The signal is generated by one of the digital chips and then shaped into a sine wave by some LC filters. It seems like the frequency should be quite accurate, since the clock for the system is a 34.35 MHz crystal, but I guess the filtering could be off frequency enough to attenuate the signal.
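Just as a back-of-the-envelope check, and purely my own guess that the carrier is derived from that clock by an integer divider (nothing I have seen documents how the chip actually generates it), dividing 34.35 MHz by 75 and by 76 lands close to the two frequencies John's scope reported:

```python
# Speculative sanity check only: IF the 455 kHz carrier were made by integer
# division of the 34.35 MHz system clock (an assumption, not a documented fact),
# divide-by-75 and divide-by-76 land near the two frequencies seen on the scope.
clock_hz = 34.35e6

for n in (75, 76):
    print(f"divide by {n}: {clock_hz / n / 1e3:.2f} kHz")
# divide by 75: 458.00 kHz   (scope readout showed 458.7 kHz)
# divide by 76: 451.97 kHz   (scope readout showed 452.0 kHz)
```

Either way, a crystal-derived carrier should be essentially dead-on, which is why I suspect the filtering, not the clock, when the output is weak.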

My (incomplete) notes indicate that the digital version of the signal passes through a double LC filter (L1 & L2 in series, C's shunted to ground), then to the emitter follower TR5, which has a 100 ohm emitter resistor R3, and then through some additional L's and C's (which aren't present on the TMCC Base's output) to the output pin. Plenty of opportunities for a bad component.

Check the base of TR5; I think there is also a test point after the first LC filter L? and C2 after the emitter follower. Moving on to the next PC card, C1 is the output coupling capacitor that was damaged on my Base. L12 and L17 carry this signal to the stud terminal.

Let us know what you find.

I think it's just my unfamiliarity with the features of the 'scope.  I'm not sure what the frequency on the right bar is.  It's interesting to note that it's different for two Legacy bases, especially since one has a problem.

 

I'm not going to take it apart; we'll send it back to Lionel.

 

I never thought of connecting the standard counter to it, and it's right at eye-level on the bench!

 

 

Last edited by gunrunnerjohn
Originally Posted by gunrunnerjohn:

An excellent idea, Dale; I'll put some ID on the bottom, inside the "legs".

 

I plan on measuring any of them I get my hands on; it's easy enough to do, and it certainly shows that size matters!

 

The fellow who owned the low-output base I checked told them specifically that it had low output. That seemed to work, because the base he got back had been fixed or exchanged.

I'm not a TMCC user, so this question is simply out of curiosity. But earlier it was proposed to use the meter output voltage of the R2LC FM radio chip for the traveling signal-strength car idea. Just looking at the meter output specs, these chips have an operating dynamic range of well over 50 dB for reliable FM operation. Yet, in GRJ's scope photo, the difference between a "good" base and a "bad" base is only about 5 dB.

 

Does TMCC operate at the lower limit of the radio receiver chip's sensitivity, such that a 5 dB difference in amplitude puts it over the edge? Or is the reduction in amplitude a symptom of something else, such as something upsetting the command signal modulation itself rather than just its amplitude?

 

Edit: in case it's not obvious, the point of this goes to the feasibility of using the meter output pin. That is, the sensitivity, or volts-per-dB, of these logarithmic circuits varies substantially chip to chip. If the intended use is to compare one traveling-car reading to another, some form of calibration or normalization is required. OTOH, if it's just one traveling signal-strength car making the rounds (remember the OGR traveling boxcar?), I suppose relative readings could work, considering you apparently only need to resolve differences of a few dB.
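For what it's worth, here is a rough sketch of the kind of normalization I have in mind. Every number in it (the volts-per-dB slope, the reference voltage) is a made-up placeholder for illustration, not an R2LC spec; a real traveling car would need a one-time calibration against a known signal level:

```python
# Rough sketch of normalizing a log-detector "meter" output to relative dB.
# All constants below are hypothetical placeholders, NOT R2LC specifications.

def reading_to_db(volts, slope_v_per_db=0.02, ref_volts=1.50):
    """Convert a meter-output voltage to dB relative to a calibration point.

    slope_v_per_db and ref_volts would have to be measured once per chip/car,
    since the volts-per-dB of these log circuits varies part to part.
    """
    return (volts - ref_volts) / slope_v_per_db

# Example: readings taken at two spots on the layout
print(f"{reading_to_db(1.50):+.1f} dB")   # +0.0 dB (the calibration spot)
print(f"{reading_to_db(1.41):+.1f} dB")   # -4.5 dB (a weaker-signal area)
```

With a single car making the rounds, even uncalibrated relative readings like this would show the few-dB dips you would care about.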

Last edited by stan2004

That's a good point, Stan, and I don't know the answer for sure. I'm guessing that since the amplitude is about half of normal even for an unloaded base, the drive circuit has a real issue driving a capacitive load such as the layout. Since the failing unit is boxed up to go back, the experiments will have to wait for another bad Legacy unit.

 

The failed base managed to start up the Legacy engine, and even got it moving. However, control was erratic, even though it never dropped out of command mode. It just ran at the same throttle setting halfway around the test loop until it responded to the remote again.

I get 454.8859 kHz with an HP high-stability time base and a 10-second gate, and 454.880 kHz with a Rigol scope. The scope has what it calls a "counter", probably with a 1-second gate. The manual was written by MS, I think. I don't know its stated accuracy, but it's useful enough. The HP has always been good to 1 part in 10^9 or better. If you just look at the screen readout, where I think the Rigol measures the period of a cycle (or a few cycles), I get 452.xxx to 456.xxx kHz. I can see the FM.

Dale, John, Chuck -

 

I again connected my scope across the 10K resistor as I had done before (the correct way), and with the track connected to the base, the trace was as before: no change, so far so good.

 

I turned on the remote and started one of the Legacy engines that was on the tracks, blew the horn several times, and then shut it down. The engine responded fine, but I could not see any evidence of the commands on the scope, just the smooth 455 kHz signal as before. Should I be able to see something, or are the command signals such short bursts that they cannot be seen on the scope?

 

Thx

 

Alex

In the absence of a technical spec, you can infer how much the frequency should be changing by looking at the R2LC receiver board. In the lower right corner of my previous picture is the black 455 kHz filter with the letter "F" on it, which indicates its bandwidth. The "F" filter has a 3 dB bandwidth of +/-4.2 kHz. This suggests that even when commands are being sent, the frequency is changing by less than 1%.

 

So in the case of a typical scope screen with 10 horizontal scale divisions, whether looking at one cycle or many cycles, a 1% shift is no more than 1/10th of a scale division, which can be hard to resolve.
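Putting rough numbers on that (my arithmetic, using the +/-4.2 kHz figure above):

```python
# Worked numbers for the paragraph above.
deviation_khz = 4.2      # "F" filter 3 dB bandwidth, +/- kHz
carrier_khz = 455.0

shift = deviation_khz / carrier_khz
print(f"fractional frequency shift: {shift:.1%}")        # ~0.9%

# Over a 10-division screen, a ~1% change in period accumulates to roughly
# 1% of the screen width at the last visible edge:
print(f"edge movement: {shift * 10:.2f} divisions")      # ~0.09 division
```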

Originally Posted by gunrunnerjohn:

I know that, Chuck; my question is why it would idle at one or the other frequency with no command inputs. I would assume the base would always have one "idle" bit output.

Oh. Well, maybe someone didn't. Anyway, I figure it idles like a Model 19, but maybe your scope measures the period of one cycle and interprets that as the frequency?

GRJ, here's a job for you: I'm imagining you hooking up your scope to the R2LC radio chip's detector output (pin 9). That voltage goes up and down in proportion to frequency, and then you can see how the transmitted frequency varies when idle vs. when a command is active.

 

And I get to say it first this time:

 

Nothing is so easy as the job you imagine someone else doing!


cjack is correct: the processor pin driving the modulator is either 0 or 1; there is no idle. And the base is always transmitting a NOP in lieu of any command, so again, there is no idle state.

Last edited by SantaFeFan

I wanted to comment about the signal level at the base "U" terminal. The level is checked at the factory for a minimum of 4 V peak-to-peak. Most bases are around 5 V P-P, and some are as high as 6 V P-P.

 

The radio is checked at much less than 4 V P-P in the design phases. Since the radio is looking for microvolts, most of the signal issues folks see are not really related to base output levels, although the base should still put out the proper levels for optimum loco and accessory operation! Also, Legacy bases are crystal controlled, well within the tolerance of the radio in the loco. It is my experience that a Legacy base cannot be off frequency by even a small amount; the processor checks timing at startup and stops the base from operating if it is off frequency.

Last edited by SantaFeFan

Jon, you have one coming back from Henning's that is failing to communicate with locomotives; the only thing I could see wrong was a lower output of around 3 V. It would barely control a Legacy engine, and it frequently lost control of it completely as it ran. The TMCC signal seems to be sensed, since the engine doesn't drop into conventional mode, but the base doesn't control it properly either.
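For scale, and this is just my own quick arithmetic using the roughly 5 V typical figure above and the roughly 3 V reading here, that drop is only a few dB:

```python
# Quick scale check: how many dB down is a ~3 V output versus a ~5 V typical base?
import math

typical_v, failing_v = 5.0, 3.0
print(f"{20 * math.log10(typical_v / failing_v):.1f} dB")   # ~4.4 dB
```

That is in the same ballpark as the roughly 5 dB difference Stan estimated from the scope shots.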

 

If there is no idle (clearly you should know), I'm curious why I had two bases that each seemed to consistently measure either the high or the low frequency. Was that just a measurement fluke?

John, just a thought about the 'scope measurements.  The two frequencies passing through the filtering can cause an AM component, with one frequency coming through stronger than the other.  Perhaps your scope triggered on the larger of the two.  With the tolerances of the filter components, you probably can't know which frequency will be larger.

With a voltage spread of 4-6 volts, this would seem to indicate that there is a fair amount of component value tolerance, since the signal starts as a fixed-amplitude square wave and gets shaped into the sine wave by the filters before and after the output stage (mostly before). Since there are at least half a dozen frequency-tuned L's and C's, it wouldn't be hard to get off peak.
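As a rough illustration of how little it takes, here is a single parallel-resonant stage tuned to 455 kHz with made-up component values (not from the Lionel schematic), showing how far the peak moves if one capacitor drifts:

```python
# Illustration only: component values are hypothetical, not from the Lionel schematic.
# How far does a 455 kHz LC stage drift if one capacitor is 5% or 10% off-value?
import math

def resonant_khz(l_henry, c_farad):
    return 1.0 / (2 * math.pi * math.sqrt(l_henry * c_farad)) / 1e3

L = 220e-6                                   # assume a 220 uH inductor
C = 1.0 / ((2 * math.pi * 455e3) ** 2 * L)   # the value that tunes it to exactly 455 kHz

for err in (0.0, 0.05, 0.10):
    print(f"C off by {err:.0%}: resonates at {resonant_khz(L, C * (1 + err)):.1f} kHz")
# C off by 0%: resonates at 455.0 kHz
# C off by 5%: resonates at 444.0 kHz
# C off by 10%: resonates at 433.8 kHz
```

Whether the actual shaping filters are sharp enough for a shift like that to cost a volt or two of output, I don't know, but it shows how quickly the tuning moves with a single off-value part.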


Last edited by Dale Manquen
Originally Posted by gunrunnerjohn:

If there is no idle (clearly you should know), I'm curious why I had two bases that each seemed to consistently measure either the high or the low frequency. Was that just a measurement fluke?

As discussed in your other TMCC thread:

 

https://ogrforum.com/t...ounds-command-format

 

the bit rate out of an R2LC is 3000 baud. So if the carrier is 455 kHz, each bit interval contains about 150 cycles. In your scope shots above, you capture about 7 cycles. So, in a classic example of "do the math", the probability that any given scope screen captures cycles that lie completely within one bit time is extremely high. Hence, that's why you only seem to capture one or the other frequency... in my opinion.
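Here is that math spelled out (a quick sketch using the 3000 baud and roughly-7-cycles-per-screen numbers above):

```python
# "Do the math": chance that a ~7-cycle screen capture straddles a bit boundary.
carrier_hz = 455e3
baud = 3000
cycles_per_bit = carrier_hz / baud      # ~152 carrier cycles per bit
cycles_on_screen = 7

# A randomly timed capture spans two bits only if it starts within the last
# `cycles_on_screen` cycles of a bit interval.
p_straddle = cycles_on_screen / cycles_per_bit
print(f"cycles per bit: {cycles_per_bit:.0f}")
print(f"P(screen spans a bit boundary): {p_straddle:.1%}")           # ~4.6%
print(f"P(screen shows a single frequency): {1 - p_straddle:.1%}")   # ~95.4%
```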

 

Separately, and this is more speculative since I have no idea how your scope measures frequency, but you showed two frequencies of 452.0 and 458.7 kHz. Let's assume these are accurate measurements of the two frequencies. The average of these is about 455.4 kHz, which makes sense. There was some discussion about that "other" frequency measurement on your screen of 454.86 or 454.88 kHz. I wonder if this represents some kind of average over many screens? In other words, it seems to be about the mid-point of the two frequencies. As to why it doesn't "average" to 455.4 kHz, that might be explained by your scope screen below from the other thread. That is, if a "1" is frequency A and a "0" is frequency B, it appears the signal spends more than half the time at frequency A. This bias was also true for the other command examples you gave. This might explain why the average frequency is not exactly the mid-point of frequency A and frequency B.
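Along those lines, you can back out how lopsided the split would have to be to land on the counter reading. This is just my own rough check, and it assumes the 452.0 and 458.7 kHz readings are accurate and that the counter is effectively time-averaging the two:

```python
# Rough check: what time split between the two FSK frequencies would average
# out to the long-gate counter reading? (Assumes the 452.0/458.7 kHz scope
# readings are accurate and the counter result is a simple time average.)
f_low, f_high = 452.0, 458.7    # kHz, from the scope captures
f_counter = 454.88              # kHz, long-gate counter reading

midpoint = (f_low + f_high) / 2
frac_low = (f_high - f_counter) / (f_high - f_low)

print(f"midpoint: {midpoint:.2f} kHz")                 # 455.35 kHz
print(f"time spent at {f_low} kHz: {frac_low:.0%}")    # ~57%
```

Roughly a 57/43 split toward one frequency would be enough to pull the average off the midpoint, which is consistent with the bias visible in the capture below.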

 

[Image: Bell]

