
Using YAML and Python to build a simple orchestrator.

Hello,

This is part 1 of a multipart article series.

I have a small library of scripts I've been writing to work with network devices and accomplish tasks for me.

They all take arguments and are based on the same 'template', so the format of the arguments is the same across scripts, but each script's output and the actions it takes differ.

They each take device_ip (a string), config_set (a Python list), and debug (a bool: True or False).

They run from the CLI in a format similar to:

python example_script.py x.x.x.x ['commands.yaml'] false
 
<OUTPUT OMITTED>
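
For context, the argument handling inside each script might look something like the sketch below; the sys.argv indexing and the ast.literal_eval call for the list are my assumptions, not necessarily how the real scripts parse their inputs.

import ast
import sys

# Hypothetical argument handling for example_script.py:
# device_ip arrives as a plain string, config_set as a
# Python-style list literal, and debug as "true"/"false".
device_ip = sys.argv[1]
config_set = ast.literal_eval(sys.argv[2])  # e.g. "['commands.yaml']"
debug = sys.argv[3].lower() == "true"

print(device_ip, config_set, debug)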

But what if I wanted to orchestrate changes during a change window?

I could run the scripts manually in the order I need with the config sets that belong to each device, but then what's the point of the automation to begin with?

Instead I thought: okay, why not write a simple tool to orchestrate and execute the scripts!

I don't want to write any code during a change window, or run commands by hand in sequence, or do much of anything that will

  1. take up time at night, or

  2. cause me to make mistakes.

I DO need to be able to leverage all the scripts from a single file.

Thus, the result is the following proof-of-concept code: take a YAML file, devices.yaml, and define the parameters there. I'd then import the structure into a Python dictionary to compose the execution order.

Define the devices.yaml file:

---
device_1:
    ip: "x.x.x.x"
    config_set: [
        "conf t",
        "interface gi1",
        "shutdown"
        ]
    debug: False

device_2:
    ip: "x.x.x.x"
    config_set: [
        "conf t",
        "interface gi2",
        "shutdown"
        ]
    debug: False

device_3:
    ip: "x.x.x.x"
    config_set: [
        "conf t",
        "interface gi3",
        "shutdown"
        ]
    debug: True
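
Once loaded, this file maps to a nested Python dictionary along these lines (a sketch using the same values, with device_2 and device_3 abbreviated):

# The nested dictionary devices.yaml deserializes to (sketch):
devices = {
    "device_1": {"ip": "x.x.x.x",
                 "config_set": ["conf t", "interface gi1", "shutdown"],
                 "debug": False},
    # device_2 and device_3 follow the same shape
}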

Next, we can look at the code responsible for deserializing the YAML data into a Python dictionary. I will use the PyYAML library along with the subprocess module built into Python to run the actual commands once they are assembled.

import subprocess

import yaml


# Deserialize the YAML file into a nested Python dictionary.
with open("devices.yaml", "r") as stream:
    yaml_file = yaml.full_load(stream)


for value in yaml_file:
    # Show the command that is about to run for this device.
    print(f"python example_script.py "
          f"{yaml_file[value]['ip']} "
          f"{yaml_file[value]['config_set']} "
          f"{yaml_file[value]['debug']}")

    # Build the argument list from the actual YAML values,
    # converting the list and bool to strings for subprocess.
    bashCommand = ["python", "example_script.py",
                   yaml_file[value]['ip'],
                   str(yaml_file[value]['config_set']),
                   str(yaml_file[value]['debug'])]

    process = subprocess.run(bashCommand)


"""
<SAMPLE OUTPUT>
['python', 'configure_devices.py', "yaml_file[value]['ip']", "yaml_file[value]['config_set']", "yaml_file[value]['debug']"]
['python', 'configure_devices.py', "yaml_file[value]['ip']", "yaml_file[value]['config_set']", "yaml_file[value]['debug']"]
['python', 'configure_devices.py', "yaml_file[value]['ip']", "yaml_file[value]['config_set']", "yaml_file[value]['debug']"]

# The commands would look something like this
python example_script.py x.x.x.x ['conf t', 'interface gi1', 'shutdown'] False
python example_script.py x.x.x.x ['conf t', 'interface gi2', 'shutdown'] False
python example_script.py x.x.x.x ['conf t', 'interface gi3', 'shutdown'] True
"""

The subprocess module then runs each command as it is formed. Additional work is needed to read the YAML file in robustly and stage changes across multiple devices, so I will return with an updated copy of the code once it is fully implemented.
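
As one possible refinement, here is a sketch that serializes config_set with json.dumps so the list survives the trip through the command line intact; the json round-trip (and json.loads on the receiving side) is my assumption, not part of the original scripts.

import json
import subprocess

import yaml

with open("devices.yaml", "r") as stream:
    yaml_file = yaml.full_load(stream)

for value in yaml_file:
    # json.dumps produces one unambiguous string argument,
    # unlike str() on a Python list.
    bashCommand = ["python", "example_script.py",
                   yaml_file[value]["ip"],
                   json.dumps(yaml_file[value]["config_set"]),
                   str(yaml_file[value]["debug"])]
    subprocess.run(bashCommand)

# The receiving script would then decode the list with:
#     config_set = json.loads(sys.argv[2])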

Thanks for checking out this article,

Have a nice day, and keep an eye out for part 2.

T

