
Hey Dave. I'm extremely excited to play with this. What an amazing resource you've provided.

Love the idea of a REST API. A couple of unsolicited thoughts (please take with a grain of salt; I know you are under no obligation to share your time any more than you already have):

  • I wonder, though, if integration code like Alexa, etc., should live outside the package, just to keep it easier to maintain over the long term. There could be many other integrations in the future that leverage the HTTP API as well.
  • Given the thoughtfulness already put into this library, do you imagine a DCS control implementation could also live here? It seems what you are creating here could become THE go-to library for Lionel & MTH control.
  • Naming: I noticed there are already some PyPI packages named pylegacy, pytrain, etc. ocontrol? Whatever the case, it would be cool for "SEO" and marketing purposes if the repo and package name were the same.


P.S. Have you seen this Bluetooth control Python library for LionChief?

I can't think of a use case offhand, but I'd certainly find a use for a Home Assistant integration.

Actually, I did think of use cases, e.g.:

  • Blow the whistle when motion is detected near the train room door
  • Stop the train when motion/presence is no longer detected
  • X number of bell hits dims the train room lights

And finally, Zigbee devices (such as a simple button) can activate things on the train. Easy to set up interactions for kids and adults alike.

https://www.home-assistant.io/

@kwilliamm posted:

Hey Dave. I'm extremely excited to play with this. What an amazing resource you've provided.

Love the idea of a REST API. A couple of unsolicited thoughts (please take with a grain of salt; I know you are under no obligation to share your time any more than you already have):

  • I wonder, though, if integration code like Alexa, etc., should live outside the package, just to keep it easier to maintain over the long term. There could be many other integrations in the future that leverage the HTTP API as well.
  • Given the thoughtfulness already put into this library, do you imagine a DCS control implementation could also live here? It seems what you are creating here could become THE go-to library for Lionel & MTH control.
  • Naming: I noticed there are already some PyPI packages named pylegacy, pytrain, etc. ocontrol? Whatever the case, it would be cool for "SEO" and marketing purposes if the repo and package name were the same.


P.S. Have you seen this Bluetooth control Python library for LionChief?

@kwilliamm

As Homer might say, "doh!" I had not checked PyPI until right now, and yes, pytrain is already taken. Kind of makes sense, given all the machine learning interest out there. I will add some kind of suffix that makes sense, at least to me, although suggestions will be welcome!

One of the reasons I took on packaging is that I wanted to keep the RESTful API and Alexa integration in a separate project and not have the overhead of it in the main PyTrain. I definitely agree: they should be separate projects, and they will be. I'm still debating whether I want to combine the RESTful API and Alexa into one project or break those out separately. Do you have an opinion on that?

I have no experience with DCS. I'm assuming this is what MTH uses? I think I remember hearing somewhere that MTH was very proprietary and actually went after someone who tried to reverse engineer their protocol. I don't have any of their equipment, nor any of their engines, so at least for the time being, I'm not going to take that on.

Thanks!

  -- Dave

@cdswindell posted:

I have no experience with DCS. I'm assuming this is what MTH uses? I think I remember hearing somewhere that MTH was very proprietary and actually went after someone who tried to reverse engineer their protocol. I don't have any of their equipment, nor any of their engines, so at least for the time being, I'm not going to take that on.

Mark DiVecchio has done a lot of work with DCS and has decoded their protocol very well.  Check out his RTC page: http://www.silogic.com/trains/RTC_Running.html

This is a pretty impressive program, and he does it all via WiFi. He has extensive information on the actual protocol to talk to the DCS as well.

@Rollsington posted:

I can't think of a use case offhand, but I'd certainly find a use for a Home Assistant integration.

Actually, I did think of use cases, e.g.:

  • Blow the whistle when motion is detected near the train room door
  • Stop the train when motion/presence is no longer detected
  • X number of bell hits dims the train room lights

And finally, Zigbee devices (such as a simple button) can activate things on the train. Easy to set up interactions for kids and adults alike.

https://www.home-assistant.io/

The way Google and Alexa smart home devices work is by making use of the APIs of the client applications they control. Thus, I need to build this API into PyTrain in order to build the Alexa integration.

  — Dave

@kwilliamm posted:

Hey Dave. I'm extremely excited to play with this. What an amazing resource you've provided.

Love the idea of a REST API. A couple of unsolicited thoughts (please take with a grain of salt; I know you are under no obligation to share your time any more than you already have):

  • I wonder, though, if integration code like Alexa, etc., should live outside the package, just to keep it easier to maintain over the long term. There could be many other integrations in the future that leverage the HTTP API as well.
  • Given the thoughtfulness already put into this library, do you imagine a DCS control implementation could also live here? It seems what you are creating here could become THE go-to library for Lionel & MTH control.
  • Naming: I noticed there are already some PyPI packages named pylegacy, pytrain, etc. ocontrol? Whatever the case, it would be cool for "SEO" and marketing purposes if the repo and package name were the same.


P.S. Have you seen this Bluetooth control Python library for LionChief?

I think I’ll name the package pytrainrr or pytrain-rr.

  — Dave

Dave, all the work you guys are doing with these protocols makes you wonder why Lionel and MTH can't even make a remote.

Any company needs to prioritize initiatives based on many factors. One would need to accept that a physical remote replicating the Cab-2 (and now the app) didn't make the cut.

If you believe you can create one that is cost effective, I'm sure there are folks that would buy it. Let us know when we can order one. I'd like it to be under $200 and perform all the functions of the Cab-2 and smartphone apps that go through the Base3 or direct to equipment that supports BT, etc.

@David_NJ posted:

If you believe you can create one that is cost effective, I'm sure there are folks that would buy it. Let us know when we can order one. I'd like it to be under $200 and perform all the functions of the Cab-2 and smartphone apps that go through the Base3 or direct to equipment that supports BT, etc.

Nothing like unrealistic expectations!

Just added the ability to "replay" a set of commands at start-up. With this feature, and tools already built into the OS of the Raspberry Pi and macOS (and probably Windows as well), you can:

  • Activate an accessory at a specific time of day
  • Start and stop trains based on some external event
  • Turn on your layout lights at sundown and off at sunrise (if you pulled an all-nighter)
  • Blow the horns of all active trains at noon
  • and many others...

The PyTrain startup option to do this is -replay, followed by the name of a text file containing the commands to run at startup. The option can be used on either a PyTrain client or a server.

  -- Dave

John, can you come up with the PCB Gerber files for a remote, or is that someone else's bailiwick? I've spoken with someone about doing the layering of the code for a touch screen... we are sort of at a Cab-1L on steroids, or a Cab-2 lite, to keep it reasonable.

The bigger issue is coming up with a design for the hardware; actually developing the PCB design is just part of the picture. For something like a TMCC/Legacy remote, the hardware design and debug would be a lot more effort than the actual PCB design.

The bigger issue is coming up with a design for the hardware; actually developing the PCB design is just part of the picture. For something like a TMCC/Legacy remote, the hardware design and debug would be a lot more effort than the actual PCB design.

What if you used my software as the base? Then you are really just hardware debugging the physical switches, pots, and display...

@cdswindell posted:

What if you used my software as the base? Then you are really just hardware debugging the physical switches, pots, and display...

That would be a giant step forward. The first step would be coming up with a form factor. I certainly think it should be a handheld remote. We'd have to agree on the actual display and the exact configuration of the switches so that the hardware matched the finished software. Clearly, that can be done; I'm just pointing out that such a project requires coordination between software and hardware development.

We don't want to hijack Dave's thread here... is it worth a separate thread to try and winnow it out, or is there too much to chew on?

My own 2 cents: there is a fair amount of discussion to be had as to what features a handheld must control, as well as what features would be nice to have. At the end of the day, each feature will just send a bunch of bytes out the WiFi interface to the Base 3 (or to a PyTrain server, so we have the Ser2 feedback). PyTrain knows how to send all of the published TMCC and Legacy commands, so it can handle any features the discussion arrives at. I think it would be better to have these discussions in a separate thread. I promise to follow it.

Just to stir the pot, here are some things to ponder:

  • Is it acceptable to just support TMCC and Legacy engines? (PyTrain doesn't do Bluetooth yet, and many LionChief engines also support Legacy)
  • Must it control turnouts? (requires a dynamic LCD and GUI)
  • Must it fire routes? (requires a dynamic LCD and GUI)
  • Must it operate accessories? (requires a dynamic LCD and GUI)
  • Must it be able to build/configure trains? (GUI work, and I need to add a few more commands)
  • Must it allow editing of road names and numbers?
  • Must it allow the addition and configuration of new components? (set address, set command type, etc.)

One of the issues with the dynamic displays required to implement the more involved GUIs is that they may drive the design to something that needs more than a simple 4-line, 20-character LCD display. Inexpensive OLED displays do exist for the Pi, and some even support touch screens. However, this will mean more software. The OLED displays take a few more pins than the LCD screens do (I drive mine off the 2-pin I2C bus on the Pi), which means fewer pins for switches. I do have I2C support for GPIO extenders up and running, but now you need to make room, and have power, in the handheld for these boards too.
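To make the button side of this concrete, here is a tiny, purely illustrative sketch of wiring one physical button on a Pi GPIO pin to a callback with the gpiozero package. The blow_horn() handler is a hypothetical placeholder rather than a PyTrain call, and the pin number is arbitrary:

# Illustration only: one physical button on a Pi GPIO pin firing a callback.
# blow_horn() is a hypothetical placeholder (a real handler would send the
# TMCC/Legacy horn command); GPIO pin 17 is arbitrary.
from signal import pause

from gpiozero import Button

def blow_horn() -> None:
    print("horn!")

horn_button = Button(17)              # BCM pin numbering
horn_button.when_pressed = blow_horn  # fire the callback on each press

pause()                               # keep the script alive, waiting for presses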

Fun stuff to talk about, and certainly a nice distraction.

  -- Dave

That would be a giant step forward. The first step would be coming up with a form factor. I certainly think it should be a handheld remote. We'd have to agree on the actual display and the exact configuration of the switches so that the hardware matched the finished software. Clearly, that can be done; I'm just pointing out that such a project requires coordination between software and hardware development.

John, agreed!

@David_NJ posted:

Any company needs to prioritize initiatives based on many factors. One would need to accept that a physical remote replicating the Cab-2 (and now the app) didn't make the cut.

If you believe you can create one that is cost effective, I'm sure there are folks that would buy it. Let us know when we can order one. I'd like it to be under $200 and perform all the functions of the Cab-2 and smartphone apps that go through the Base3 or direct to equipment that supports BT, etc.

How nice of you to be willing to buy one 🤣

Dave, all the work you guys are doing with these protocols makes you wonder why Lionel and MTH can't even make a remote.

Well, software and hardware are two different animals, as you know, and my software only handles 2 protocols (TMCC/Legacy and PDI). And I don't have to answer to management, support legacy (as in older) products, or do any user support (yet 🤣)!

Last edited by cdswindell

I have a very limited understanding of the programming and electronic issues in this project. However, Dave, you are doing a phenomenal job in the development of a remote device to replace or upgrade the Cab-2 handheld. I hope this thread may wake up Lionel to support the development of a handheld unit or aid in the progress so far, although I have my doubts.

Marty

What about using an MFi iOS controller (Bluetooth video game controller) to send commands to the Raspberry Pi? It could also hold the phone to show the Cab3 app. I understand that there may be (a lot of) additional code to incorporate something like this. However, the gaming controllers are inexpensive and durable, allow a wide range of input types, and can navigate menus and data input rather quickly even without a number pad. Food for thought for Dave and others in the trenches on this project. Thanks for making everything public, Dave. Very fun and exciting stuff.

@JD2035RR posted:

What about using an MFi iOS controller (Bluetooth video game controller) to send commands to the Raspberry Pi? It could also hold the phone to show the Cab3 app. I understand that there may be (a lot of) additional code to incorporate something like this. However, the gaming controllers are inexpensive and durable, allow a wide range of input types, and can navigate menus and data input rather quickly even without a number pad. Food for thought for Dave and others in the trenches on this project. Thanks for making everything public, Dave. Very fun and exciting stuff.

I must be missing something here, as I'm not sure what this accomplishes. We could run the Cab 3 app on the iPhone already, right? Are you looking to use the joystick on the controller to control engine speed?

Thanks,

  -- Dave

I just checked in a new program, make_service.py, that installs PyTrain as a service on Raspberry Pis. What does this mean and why should you care? One of my goals is to make the Pi as much of "an appliance" as I can. When you turn on the main power to your layout, it is important to me that all of the embedded Pis in the control panels boot up and start running the PyTrain software with no fuss or further configuration. Lionel's Base 2 and Base 3 do the same thing. Even though they are really special-purpose computers with custom programming, when you flip on the main power, they just turn on and start working (most of the time 😉).

The way you make this happen on a computer like the Pi is to install your program as a system service. This commit adds the program make_service.py, which creates a system service to run PyTrain as either a server or a client. Once this is done, PyTrain will just fire up every time you power on your control panel(s). The client instances will even wait for your PyTrain server to launch, and the server will wait for the Base 3 to boot.
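For the curious, the general shape of the approach looks something like the sketch below. This is not the actual make_service.py; the unit contents, service name, and the command that launches PyTrain are assumptions you would need to adjust for your own setup:

# Rough sketch of the systemd approach, NOT the actual make_service.py.
# The unit contents, service name, and ExecStart command are assumptions.
# Must be run as root to write into /etc/systemd/system.
import subprocess
from pathlib import Path

UNIT = """\
[Unit]
Description=PyTrain (sketch)
After=network-online.target

[Service]
# Replace with whatever command launches PyTrain on your system.
ExecStart=/usr/bin/env python3 -m pytrain
Restart=on-failure

[Install]
WantedBy=multi-user.target
"""

def install_service(name: str = "pytrain-sketch") -> None:
    # Write the unit file, then enable it so it starts at every boot.
    Path(f"/etc/systemd/system/{name}.service").write_text(UNIT)
    subprocess.run(["systemctl", "daemon-reload"], check=True)
    subprocess.run(["systemctl", "enable", "--now", f"{name}.service"], check=True)

if __name__ == "__main__":
    install_service()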

My goal is to have it all just work!

  -- Dave  

While on the subject of system management (I do realize this is a train site), I've also added a number of commands to PyTrain to simplify the management of a PyTrain environment, AKA a bunch of control panels. Those commands are:

  • update: checks for PyTrain updates and installs them as necessary
  • shutdown: does a clean shutdown of all Pis. On my master control panel, I plan to have a big button that I will press and hold for a few seconds before I power down the layout. Although Pis were designed to just have their power cords pulled out, no piece of electronics really likes this...
  • restart: restarts PyTrain on all clients and servers if something is wonky (which means I didn't do my job 😢)
  • reboot: reboots all of the Pis and restarts all services
  • upgrade: updates the Linux operating system on each Pi, then does a PyTrain update.

All of these commands automatically relaunch PyTrain whether it is running as a system service (preferred) or from a shell window (for developers and layout operators that want to use the PyTrain CLI to control their layout).

  -- Dave

Last edited by cdswindell
@cdswindell posted:

I must be missing something here, as I'm not sure what this accomplishes. We could run the Cab 3 app on the iPhone already, right? Are you looking to use the joystick on the controller to control engine speed?

Thanks,

  -- Dave

The idea is to provide a tactile handheld remote (rather than a touchscreen phone) to control engine speed, quilling whistle, bell, boost/brake, start-up sequences, etc. I'd say everything other than programming engines, which would be simpler through the app itself. The phone would show the Cab3 app with engine data, throttle position, etc.

I understand this is out of scope from what you set out to do, I was just offering a hardware option that might be able to work before having to cobble something together.

There is another train control app called Bluerail (not related to TMCC/Legacy at all). The Bluerail app allows you to control it directly using an MFi controller. The Cab3 app, to my knowledge, doesn't allow MFi/Bluetooth control of the app, which is why I was thinking perhaps you could use the Bluetooth controller input to the Raspberry Pi to activate the code you have written for TMCC/Legacy.

Here's a video demonstrating it for Bluerail: https://youtu.be/plinKYtuwrM

Last edited by JD2035RR
@JD2035RR posted:

The idea is to provide a tactile handheld remote (rather than a touchscreen phone) to control engine speed, quilling whistle, bell, boost/brake, start-up sequences, etc. I'd say everything other than programming engines, which would be simpler through the app itself. The phone would show the Cab3 app with engine data, throttle position, etc.

I understand this is out of scope from what you set out to do, I was just offering a hardware option that might be able to work before having to cobble something together.

There is another train control app called Bluerail (not related to TMCC/Legacy at all). The Bluerail app allows you to control it directly using an MFi controller. The Cab3 app, to my knowledge, doesn't allow MFi/Bluetooth control of the app, which is why I was thinking perhaps you could use the Bluetooth controller input to the Raspberry Pi to activate the code you have written for TMCC/Legacy.

Here's a video demonstrating it for Bluerail: https://youtu.be/plinKYtuwrM

Huh, cool! I was unfamiliar with BlueRail. They must have essentially enabled Bluetooth game controller control of their app.

Although I've never done it, you can pair a Bluetooth controller to an iPhone and use it as an AssistiveTouch device to control the cursor and operate the phone. Someone should order one of these things and see if it can operate the Cab 3!

https://support.apple.com/en-us/111775

  -- Dave

I've started work on the RESTful API for PyTrain.

http://127.0.0.1/engine/72 -->

{ "current_speed": 0, "road_name": "Union Pacific GP35", "road_number": "0742", "tmcc_id": 72 }


I was happy to see I could rather quickly adapt the PyTrain main program to support being called as a provider by another process. From here on out, I want to minimize changes to the main PyTrain GitHub project and do the API-specific stuff in a new GitHub project: PyTrainApi. The one exception is that I will probably add the ability to get state info in JSON form...

  -- Dave

Some quick screenshots. I have to think a bit about syntax...

http://127.0.0.1:5000/train/20 -->
{ "control": "Legacy", "direction": null, "labor": 12, "max_speed": null, "momentum": 0, "road_name": "Delaware & Hudson", "road_number": "0020", "rpm": 0, "scope": "train", "smoke": null, "speed": 0, "speed_limit": null, "tmcc_id": 20, "train_brake": 0, "year": null }


http://127.0.0.1:5000/acc/15 -->
{ "block": "on", "road_name": "Upper Main North Power District", "road_number": "0015", "scope": "power_district", "tmcc_id": 15 }


http://127.0.0.1:5000/sensor_track/50 -->
{ "last_loco_lr": 255, "last_loco_rl": 255, "road_name": null, "road_number": null, "scope": "irda", "sequence": "slow_speed_normal_speed", "tmcc_id": 50 }


http://127.0.0.1:5000/acc/50 -->
{ "road_name": "Upper Main West", "road_number": "0050", "scope": "sensor_track", "tmcc_id": 50 }


http://127.0.0.1:5000/switch/1 -->
{ "road_name": "Gantry 1", "road_number": "0001", "scope": "switch", "state": "thru", "tmcc_id": 1 }


I'm trying out the flask-restful package. It only took about 40 lines of code to generate this output. If others have suggestions for REST API frameworks that they've used in Python, I'd love to hear about them.
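For anyone who hasn't used flask-restful, a minimal sketch of the pattern that produces a GET endpoint like the ones above looks something like this. It is not the actual PyTrain code; the lookup function and the fields it returns are placeholders:

# Minimal flask-restful sketch (not PyTrain's actual code). The lookup
# function and its returned fields are placeholders for illustration.
from flask import Flask
from flask_restful import Api, Resource

app = Flask(__name__)
api = Api(app)

def lookup_engine(tmcc_id: int) -> dict:
    # Placeholder: a real implementation would query PyTrain's state store.
    return {"tmcc_id": tmcc_id, "road_name": "Union Pacific GP35",
            "road_number": "0742", "current_speed": 0}

class Engine(Resource):
    def get(self, tmcc_id: int):
        # flask-restful serializes the returned dict to JSON automatically.
        return lookup_engine(tmcc_id)

api.add_resource(Engine, "/engine/<int:tmcc_id>")

if __name__ == "__main__":
    app.run(port=5000)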

What's interesting is that because I've basically written a wrapper around the PyTrain CLI, I can literally issue any TMCC/Legacy command I want to. I will probably write simplified POST handlers for speed, direction, bell, horn, and any other common engine commands, but then I'll add an endpoint that takes a CLI command, giving me access to the entire Lionel command set.

  -- Dave

@cdswindell great idea to separate the API implementation from the main package.

I recommend using the FastAPI framework, and also using pydantic to validate requests.

Separately, you might consider using uv as the package manager and as the recommended installation method for users of PyTrain.

Using uv can help make sure the Python version and other dependencies are consistent across installs and upgrades over time.
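For readers who haven't used them, here is a minimal sketch of the FastAPI + pydantic combination being suggested. The path mirrors the /engine example earlier in the thread, and the model fields are illustrative rather than PyTrain's actual schema:

# Minimal FastAPI + pydantic sketch of the suggestion above. The path and
# model fields mirror the earlier /engine example and are illustrative only.
from typing import Optional

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Engine(BaseModel):
    tmcc_id: int
    road_name: Optional[str] = None
    road_number: Optional[str] = None
    current_speed: int = 0

@app.get("/engine/{tmcc_id}", response_model=Engine)
def get_engine(tmcc_id: int) -> Engine:
    if not 1 <= tmcc_id <= 99:        # simple range check, for illustration
        raise HTTPException(status_code=404, detail="engine not found")
    # Placeholder data; a real implementation would query PyTrain state.
    return Engine(tmcc_id=tmcc_id, road_name="Union Pacific GP35",
                  road_number="0742", current_speed=0)

The pydantic model gives you request and response validation, and FastAPI generates interactive API docs from the same type annotations at no extra cost.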

@kwilliamm posted:

@cdswindell great idea to separate the API implementation from the main package.

I recommend using the FastAPI framework, and also using pydantic to validate requests.

Separately, you might consider using uv as the package manager and as the recommended installation method for users of PyTrain.

Using uv can help make sure the Python version and other dependencies are consistent across installs and upgrades over time.

Thanks. I was reading about FastAPI last night and was planning to give it a try this morning. I liked that it helps generate the API docs.

  -- Dave

I’ve finished the first pass at the API. It’s checked into GitHub as PyTrainApi. Documentation is nonexistent as yet, but if you want to run it, set up a separate virtual environment, download the code, install the requirements with pip, and type the command:

fastapi run src/pytrain_api/fastapi_ex.py


Note you need to use run and not dev, as the PyTrain log file causes the fastapi server to keep restarting (I'll have to figure this out one day). With the server running, hit up:

http://<server ip address>:8000/


and you will get the doc pages displayed.
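If you would rather exercise the API from Python than from a browser, something like the snippet below should work. The host, port, and the /engine path are taken from examples earlier in the thread, so adjust them for your own setup:

# Quick client-side smoke test using the requests package. The host, port,
# and /engine path follow earlier examples in this thread; adjust as needed.
import requests

BASE = "http://127.0.0.1:8000"

resp = requests.get(f"{BASE}/engine/72", timeout=5)
resp.raise_for_status()
print(resp.json())   # e.g. {"tmcc_id": 72, "road_name": ..., ...}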

If anyone does give this a try, please let me know; I'm definitely looking for feedback on the API syntax.

Now on to the Alexa integration...

@kwilliamm posted:

@cdswindell great idea to separate the API implementation from the main package.

I recommend using the FastAPI framework, and also using pydantic to validate requests.

Separately, you might consider using uv as the package manager and as the recommended installation method for users of PyTrain.

Using uv can help make sure the Python version and other dependencies are consistent across installs and upgrades over time.

@kwilliamm, I really want to thank you for pointing me towards FastAPI. I finished implementing the functional endpoints I wanted to build around 6:00 this evening. After dinner, I decided I'd see what was involved with adding some kind of password security to what I did. Two hours later, I have OAuth password protection with real JWT tokens! I didn't even know what some of that meant 2 hours ago 😂. Their online documentation/tutorials are some of the best I've seen, and I've been in the business since the 1970s...
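For anyone wondering what that FastAPI security pattern looks like, here is a compressed sketch of the OAuth2 password flow with JWT bearer tokens. The secret key, the user check, and the token contents are placeholders; the full walkthrough is in FastAPI's security tutorial:

# Compressed sketch of FastAPI's OAuth2 password-flow pattern with JWT
# bearer tokens. Secret key, user check, and token contents are placeholders.
# (OAuth2PasswordRequestForm requires the python-multipart package.)
from datetime import datetime, timedelta, timezone

import jwt                                   # pip install pyjwt
from fastapi import Depends, FastAPI, HTTPException
from fastapi.security import OAuth2PasswordBearer, OAuth2PasswordRequestForm

SECRET_KEY = "change-me"                     # placeholder
ALGORITHM = "HS256"

app = FastAPI()
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")

@app.post("/token")
def login(form: OAuth2PasswordRequestForm = Depends()):
    # Placeholder check; a real app verifies a hashed password.
    if form.username != "dave" or form.password != "secret":
        raise HTTPException(status_code=401, detail="bad credentials")
    claims = {"sub": form.username,
              "exp": datetime.now(timezone.utc) + timedelta(minutes=30)}
    return {"access_token": jwt.encode(claims, SECRET_KEY, ALGORITHM),
            "token_type": "bearer"}

@app.get("/me")
def me(token: str = Depends(oauth2_scheme)):
    # Reject expired or tampered tokens; otherwise return the claims.
    try:
        return jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])
    except jwt.PyJWTError:
        raise HTTPException(status_code=401, detail="invalid token")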

PyCharm just added support for uv. For now, I'm going to stick with what I have (a GitHub workflow pushing new versions to PyPI), but I will have a look once I'm further down the road with the Alexa integration.

Oh, and as I use Netgear routers and have a Dynamic DNS entry already defined pointing to where I live in Boston, I've even used the API remotely to trigger actions on my little switcher engine in my office (a Chesapeake & Ohio NW2). It was very cool to see it move down the tracks and blow its horn via a remote web session!

Thanks again!!

  -- Dave

P.S. For others who may read this post: this framework let me develop a complete REST API for my app with about 3 days of effort, and that includes the time to learn the tool and rework PyTrain to support the API.

Last edited by cdswindell

Working with the Alexa code is nowhere near as much fun as working with FastAPI 😢. I am finding ways to do it, but my goal was to develop and publish an Alexa skill others could use as well. There is a lot of configuration needed to set up an HTTPS proxy server in front of the PyTrainApi. Maybe another approach will become clear as I proceed, but this one may not be for the faint of heart 😵‍💫.

@cdswindell posted:

Huh, cool! I was unfamiliar with BlueRail. They must have essentially enabled Bluetooth game controller control of their app.

Although I've never done it, you can pair a Bluetooth controller to an iPhone and use it as an AssistiveTouch device to control the cursor and operate the phone. Someone should order one of these things and see if it can operate the Cab 3!

https://support.apple.com/en-us/111775

  -- Dave

I wonder if one of these would work. I could see using the iPhone proper for most things, then using this widget to control your current train and get a few key buttons for speed, direction, horn, whistle, halt, etc...

Last edited by swise
