William Collins and Professional Services Program Development Manager Jennifer Lu introduce Itential DevStack – a containerized local development environment that deploys the complete Itential stack with a single command. No manual configuration, no certificate management headaches, no multi-step setup processes.
DevStack emerged from a need for an updated way to test locally without impacting customer environments. The result: a containerized local development environment that deploys Itential Platform and Automation Gateway, handles mTLS certificate generation, and even spins up the Itential MCP Server so you can plug into AI-powered infrastructure workflows immediately.
What You’ll See:
One-Command Setup
- Clone the repository and run make setup
- Itential Platform, Gateway, and MCP server deploy automatically
- Certificate generation and mTLS configuration handled for you
- Fresh environment ready in seconds, not hours
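The setup above boils down to two commands. A minimal sketch; the repository URL is a placeholder, since the announcement doesn't state where DevStack is hosted:

```shell
# Clone the DevStack repository (placeholder URL -- substitute the real one)
git clone <itential-devstack-repo-url>
cd devstack

# One command deploys Itential Platform, Automation Gateway, and the MCP server,
# generating and distributing the mTLS certificates along the way
make setup
```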
Service Import and Workflow Creation
- Import Ansible, Python, and OpenTofu services with one command
- Services stored in Git, decoupled from execution layer
- Build workflows in Studio using imported services
- Run distributed automation across gateway infrastructure
Disposable Infrastructure Pattern
- Commands that simplify testing, debugging, and data preservation
- Eliminate configuration drift between test cycles
- Start every test from a known clean state
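The disposable-infrastructure lifecycle is driven by make targets. A sketch based on the targets mentioned in the demo (`make setup`, `make down`, `make clean`); exact names and behavior may vary by version:

```shell
# Tear down containers but keep persistent data (for a long-lived test case)
make down

# Wipe everything -- containers and persistent data -- to eliminate drift
make clean

# Redeploy the full stack from a known clean state
make setup
```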
Why It Matters
Eliminates Local Development Barriers
Itential DevStack removes the complexity of setting up infrastructure locally. What used to require manual platform deployment, gateway configuration, IAG5 setup, and system integration now happens with one command. Network engineers can prototype MCP servers, test FlowAI agents, and build automation workflows without touching production systems.
Built for Modern Development Workflows
Keep test environments clean, catch bugs faster, and ensure demos work from scratch every time by following disposable infrastructure principles. The containerized architecture makes adopting new Itential features simple: FlowAI support required just two file changes and a pull request.
Expands Automation & Orchestration Access Across Teams
Serve POC teams, product support, AI innovation demos, and anyone building automation in the Itential Platform. The open-source approach removes barriers to entry – no custom Nexus repositories, no access restrictions, just clone and run.
Video Notes
(So you can skip ahead, if you want.)
00:00 Team Introduction
04:38 DevStack Use Cases
06:56 Live Setup Demo
10:43 Gateway Service Configuration
12:48 Workflow Studio Demonstration
14:26 Container Lifecycle Management
15:40 POC Team Applications
17:32 Flow AI Integration
19:17 Wrap Up & Collaboration
William Collins • 00:09
I have been working on this pretty cool project here at Itential called Itential DevStack. It is basically a containerized local development environment that gives you the complete Itential experience on any machine you want to run it on, as long as it has a container engine, preferably Docker. And you launch it with one command. That’s it. And that single command, less than a second of typing if you type as fast as me, deploys Itential Platform and Itential Gateway. It handles the mundane
William Collins • 00:54
drudgery of generating certs for the mTLS connection. It pushes those certs to Itential Platform, configures the client-side cert, and actually attaches the gateway on your machine to the platform. And it automatically instantiates the Itential MCP server, so you can plug and play that sucker with all the natural language crowd that wants to talk to their infrastructure. Now, I’ve been doing a lot of development behind this project lately, but the incredible mind of an incredible human that I’ve really enjoyed working with recently, Jenn Lu, is behind the idea, and she has also driven many of the features and behaviors that have made their way into the project.
William Collins • 01:48
So it’s been an incredible experience. But how are you doing today, Jen? And do you want to introduce yourself before I keep ranting?
Jenn Lu • 01:55
Yeah, no, absolutely. Thanks for having me on here to talk about and demo the DevStack project with you. I’m really, really excited to show this to everyone. I work in professional services. Love that you gave me all the kudos for this project, but I can’t take all the credit. It was definitely a team effort from everyone, a lot of people within PS.
Jenn Lu • 02:21
We worked alongside the platform team at Itential, who works alongside customers deploying the software, and alongside a co-op who actually built out everything that you then took and turned into the actual DevStack project. So it was definitely a huge team effort. But like I said, I work within professional services. I started off as a network automation engineer back in 2021, and I’ve been working with customers to build out use cases for them, as well as teaching them how to use our platform and build out use cases themselves within the Itential Platform. And most recently, I took on the role of program development manager. I currently oversee a lot of the programs within PS, implementing processes for those programs, as well as creating really cool tooling to help with program execution. So it’s been a lot of fun.
William Collins • 03:24
I love it. And you said a few things there. This idea kind of circulated; it’s something that folks from PS and platform wanted to do. But then, you know, we have a great co-op program. It’s incredible. And so you had a co-op come in that really started prototyping the original version of this.
William Collins • 03:46
And right, because you’re so busy. We’re all so busy doing our day jobs and all the other things we have to do. So that’s kind of how it went. I love that.
Jenn Lu • 03:56
Yeah.
William Collins • 03:57
Let me share my screen real quick. And while I’m doing that, my take on it is that it eliminates the barrier to entry, so network engineers and developers can explore and test Itential capabilities, new things coming out, without even touching production. Even internally here at Itential, I use this thing every day now to test and prototype things, because if you spin up an MCP server, you need quick infrastructure to test it against. So, hey, why not do that with one command instead of figuring out a bunch of things?
Jenn Lu • 04:38
Exactly.
William Collins • 04:38
But what else would you all be using it for? Is that kind of the gist of it?
Jenn Lu • 04:43
Yeah, so originally the project started out with: hey, we need a more updated way for professional services engineers to get a local development environment running on their machine. A lot of our engineers work with customers. They’re in the customer’s infrastructure, in the customer’s environment, building out and delivering really cool things for them. But the problem is, what happens if you need to test something locally that you can’t do within a customer’s environment? What if you want to build your own cool automation and orchestration use cases in the platform, separate from what you’re building out for a customer? How do you do that? We had a very old project that we called Local Cloud that also ran the platform within Docker.
Jenn Lu • 05:27
But the problem with that project is that it hadn’t been updated in a while. It didn’t support P6 and all the new features that P6 had to offer. It didn’t support IAG5, the service execution layer. And it also had a lot of really old legacy code in there. So that’s what brought this project to fruition: we needed a more up-to-date way of running the latest and greatest stuff Itential had to offer. And so we landed on this DevStack project, which we originally coined the Docker Compose Development Lab, which is a mouthful. Itential DevStack definitely sounds a lot cooler in my opinion.
Jenn Lu • 06:18
But yeah, now it’s really kicked off. Originally, in my head, it was just a way to help our internal folks, mainly PS and maybe engineers within product support, get the platform up and running, test out features, and build out cool stuff. But now, with what William has built, we’re planning to use it for demos, for POCs, to demo Flow AI. The possibilities of this project are honestly endless.
William Collins • 06:56
And you said that other project wasn’t getting updated as much. As you can see from this project, we are on fire. We are doing lots of stuff. It is continually updated, continually up to date with all the modern, latest, and greatest. So, yeah. Thank you for that. That adds a lot of clarity.
William Collins • 07:16
And hey, why don’t we just jump in? I have my screen here, and I’m going to start literally completely from scratch, folks. In pure William-like fashion, of course, I didn’t prepare anything. This is completely off the cuff. So I’m going to clone the repo down.
Jenn Lu • 07:40
This is the best way to do a demo.
William Collins • 07:44
Yes. Great. So I’m going to go ahead and switch into that directory. And when I said one command earlier, I meant one quick, very fast command. All I have to do is make setup. This is going to go through and do everything I talked about earlier on the call. And one thing to call out here.
William Collins • 08:12
Especially when you’re doing things like this, development work, testing, trying to find bugs, trying new things, the best way to do that is following those patterns of disposable infrastructure. Whenever I get done testing something, I want to be back to square one. I don’t want pockets of persistent data in different places. I want to be back to a clean working environment with everything fresh. That allows you to catch bugs a little bit quicker and just have a clean working environment. So if we say, hey, we want to provide this to a customer or a prospect or a demo or whatever it is we’re doing, we know that it’s going to work from scratch, from the ground up.
William Collins • 08:55
So, we see these are running now. And one thing that I need to do real quick: I have the platform in a container, and I have a gateway server that basically aggregates all the other gateways. Think of it like a cloud gateway with runners in different places doing different things. That’s very much an oversimplification, but on my local environment here, I have the gateway software installed so I can actually connect to the gateway platform. So I need to log into that, and enter the password one more time. So now my working machine is connected to the primary gateway server, which is connected to platform.
William Collins • 09:59
And so, what I can do now, I’m actually going to switch over and log in. And what makes this great: when I said certificate management’s tedious, it really is. But when you’re on a fresh local working environment dealing with essentially ephemeral infrastructure for testing, you’re not scaling for an enterprise, so that makes automation a little bit easier. So we can see that I’ve got my cluster.
William Collins • 10:43
The cluster is connected. All the certs are good. Everything, you know, for this instance, is self-signed. And if I go to services, I can see I don’t have anything, but I do have a fresh gateway connected to platform. So now, since I logged in from my machine here, I’m actually going to go to my repo. I do this in some of my pipeline testing as well.
William Collins • 11:06
So we have a free product called Torero, and the code under the hood is basically the same as Itential’s Gateway 5, so all the commands kind of work the same. I wrote these hello world examples for pipeline testing and other things, just quick and easy. And so what I can do is just copy this command. In this import file, you’re going to see the hello world things that I’m going to bring in. It’s going to define the repository, and then it’s going to define three different services: one is an Ansible hello world, one is a Python, one is an OpenTofu.
William Collins • 11:47
And all I’ve got to do is paste that import command in here, except, well, I need to change the command syntax a little. Yeah, that should work. Aye, there we go. Bang. So now when I refresh the UI, I should see services in there. Yes. Okay.
William Collins • 12:17
So, imagine, I know we’re doing hello world, it’s not exciting, it’s just for demo purposes. But imagine these services being, whether it’s Python, Ansible, or OpenTofu, your actual infrastructure as code, and your files are in Git. The logic is completely decoupled from the actual execution here. I’ve imported those with just one command, and then we can just run these things.
William Collins • 12:48
So I can go to Studio here. I can create a workflow. What do we call this workflow? I don’t know.
Jenn Lu • 12:56
We can follow the same logic as the hello workflow. Follow the same logic as the Torero example scripts or services.
William Collins • 13:05
There we go. Create. And all I got to do to run these things is I’m just going to look for service here. Run service. And then I’m just going to connect. Look, look at me connecting dots. Always connecting dots here.
William Collins • 13:23
It’s as easy as that. I click on the run service and choose which gateway. Imagine, in a distributed architecture, you’d have tons of gateways, probably. I’m going to click on this one since I only have one. And then I’m going to select, there are my services. So let’s do Python.
William Collins • 13:46
I was actually in Python quite a bit earlier. So save, run. And we just, I just did some hello world action here. I can double-click on that and see the output. So, yeah, that’s about it.
William Collins • 14:09
And then when I’m done with all this, when I don’t want to do it anymore, I tested my use case and I just want to take everything down, it wipes out everything. No persistent data. Everything is wiped out. Containers are down.
William Collins • 14:26
And there’s also a make down. So if you want to keep the persistent data, maybe you have a long-lived use case and you’re going to be testing tomorrow as well, and you don’t want to spend the time to do what I just did, you can just do make down. Persistent data is still there, and then it’s off to the races. So.
William Collins • 14:47
Yeah.
Jenn Lu • 14:49
I really like that. I wasn’t aware that there was a persistent data feature. I thought that once you bring a container down, or once you run make down, all the data is just deleted. So I think it’s really cool that even though you bring everything down, you can still keep the data and bring it up when you need it again.
William Collins • 15:09
Yeah. I did that actually out of necessity. I was testing something a while back, and I thought I had to go through and do a make setup again. I was thinking, you know what, I know from first principles I want to start from scratch, but I had this demo up, and I’m like, okay, I want a way to just shut down the containers but keep most of the stuff, the persistent things, there. So it just makes sense.
William Collins • 15:32
For sure. You know, I don’t do that as much. I like to start from scratch if I can, but yeah. So, what do you think? Where do you think this project’s going to go?
Jenn Lu • 15:40
No, that’s a really good question. I think one team that can really benefit from this project is our proof of concept team, our POC team. Originally, they had their own custom way of letting customers run things; I think it was also a Docker Compose project, but it wasn’t open source. The image was kept in a Nexus repository, so we had to give customers that access. And so there were a lot of barriers to entry to pull down the image, and there were still some setup steps involved.
Jenn Lu • 16:13
One thing that William did a really good job at was automating a lot of the setup steps with the make up, make down, make clean, and make setup steps. So, like he said in the beginning, just one command and you can have everything up and running, as he showed during the demo, which I think is really, really cool. Before we had the DevStack, there were a lot of manual steps involved: you had to get the platform up and running, then you had to get IAG up, then IAG5, and then you had to go in and set up everything yourself and integrate all those systems with the platform. So now, for POCs, it’s saving us a lot of time.
Jenn Lu • 16:57
When they do end up using this DevStack tool, customers can get everything running. All you need to do is embed the platform assets and all of the persistent data, which we talked about earlier, into that environment. So POC is one team that would be able to use it. Another would be our AI innovation program; I believe we’re doing demos for that program using this tool as well, to get Flow AI up and running. Is that right, William?
William Collins • 17:32
Oh, yeah. It required just one little change, and that’s the beauty of containers. Doing things with containers, and back to keeping those ephemeral, disposable infrastructure principles, makes adopting new things a little bit easier. It was just a few changes, I think two files and a simple PR, and then we were able to load Flow AI very, very easily. Starting and working these projects with the right principles from the beginning goes a long way with scaling and adding new things over time. And I just want to thank you and your team for packing me with ideas and spending time with me to help me understand.
William Collins • 18:19
Because, look, I’m just doing my thing, and professional services is boots on the ground, talking to customers, doing deployments. They’re doing the real work around here, getting things done and providing tremendous value to our customers. Since I’m not in the trenches like you all are, in order to build something good that’s actually valuable, what do you need to do? You need to talk to the folks in the trenches. And you all have been more than gracious with your time to pack me with ideas, explain things, demo things, and really get me to understand, so I can turn that into features in stuff like this. So thank you. And thank you to the whole PS team.
Jenn Lu • 19:04
You’re welcome. It was so much fun working together on this. I really loved it. Hopefully, we’ll have more projects coming soon. Fingers crossed. I really hope so. But no, this was a super fun project.
Jenn Lu • 19:16
It was great working together on it.
William Collins • 19:17
Great to see how it came out in the end, because I remember at the beginning I was thinking, oh, we’ve got a lot of stuff to do, I don’t know about this. And then sitting down weekly and really digging in and working through it has just been a blast. All right. Well, that’s it. Goodbye, everybody. Thank you so much for joining me, Jenn, and taking time out of your busy day.
William Collins • 19:38
This has been fun. You’re a natural. This is Jenn’s first time on a thing like this, and you did great. Appreciate the time.
Jenn Lu • 19:46
Thank you. This was a lot of fun.