Intro

When you are starting out in the network automation world, one of the first questions you ask is: how can I automate this device? The most common answer is to just write a script that connects to the device, executes some commands, prints/parses the output and that's it. However, many times we don't pay much attention to whether this is actually the best way to do it.

In this post I want to briefly share my thoughts on the options available, CLI or API, when automating network devices.

CLI Based Automation

This is the most common scenario in our industry at the moment, and it is the one I described above. You create a script using Ansible or Python, send some commands to the device (and sometimes also the credentials) via Telnet or SSH, wait for a response, parse the output, and based on the parsed string you continue your script.
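
As a rough sketch of what this looks like in Python, here is a minimal example using the Netmiko library; the device address, credentials and command are made up for illustration:

```python
from netmiko import ConnectHandler

# Hypothetical device details; replace with your own.
device = {
    "device_type": "cisco_ios",
    "host": "192.0.2.10",
    "username": "admin",
    "password": "secret",
}

# Open an SSH session and send a show command.
with ConnectHandler(**device) as conn:
    raw_output = conn.send_command("show ip interface brief")

# At this point raw_output is just a block of text that still
# needs to be parsed before the script can act on it.
print(raw_output)
```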

One of the downsides of this kind of automation is that you, as a network engineer, need to parse the output of the commands you sent, which sometimes is not easy. For example:

  • Which part of the output is the prompt of the device?
  • Is the command you sent echoed back in the output you captured?
  • Which part is the actual output of the command you sent?
  • How are you going to parse the output of the command you sent?

Now, imagine you are sending multiple commands at once; you need to come up with a way to split and parse the output of each individual command.

As you can see, this can be time consuming and error prone while you are developing, so you end up relying on third-party libraries to make sure it works. Which is perfectly fine, but what if I told you there is a better way?
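
To make the parsing pain concrete, here is a small sketch that pulls interface details out of the raw text from the previous example with a regular expression. The pattern is my own guess at a Cisco IOS-style `show ip interface brief` layout, so treat it as illustrative only:

```python
import re

# The kind of raw text you get back from the device (illustrative excerpt).
raw_output = """\
Interface              IP-Address      OK? Method Status                Protocol
GigabitEthernet0/0     192.0.2.1       YES manual up                    up
GigabitEthernet0/1     unassigned      YES unset  administratively down down
"""

# Hand-written pattern: interface name, IP address, two columns we ignore,
# then status and protocol. The header line does not match the pattern,
# so only data rows are captured.
pattern = re.compile(r"^(\S+)\s+(\S+)\s+\w+\s+\w+\s+(.+?)\s+(\S+)$", re.MULTILINE)

interfaces = []
for name, ip, status, protocol in pattern.findall(raw_output):
    interfaces.append({"name": name, "ip": ip, "status": status, "protocol": protocol})

print(interfaces)
```

Every new command (and every vendor) needs its own pattern like this, which is exactly the work that structured APIs remove.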

API Based Automation

This is the kind of automation I like: machine-to-machine communication, where you can have reliable, secure and structured interaction with the network devices.

To start dealing with this kind of automation you need to be aware of the following components:

  • Data Models
  • Transport Protocols
  • Encoding Schemas
  • API Protocols

Data Models

Have you ever created a script that sends a command to the device, only to find out later that it returned an error because you didn't send valid input? That is probably fine with a small script, but in larger deployments it can be bad, especially if you misconfigured something.

Wouldn't it be nice to know exactly what the device expects for a specific configuration and what the response will look like? That is where Data Models can be your friend.

Data Models describe the syntax and semantics of the specific data objects you work with.

One example of a data modeling language is YANG, which defines the rules and structure for how data can be retrieved and what the answer will look like; it also lets you validate whether the data you are about to send is valid or not.

Data Models just describe the structure of the data you are working with, but they do not describe how that data is exchanged. For that, you need a transport protocol.
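
As an illustration, this is roughly what interface configuration shaped by the standard ietf-interfaces YANG model (RFC 8343) looks like when expressed as a Python dictionary. The interface name and description are made up; the keys and hierarchy come from the model, not from my script:

```python
# Interface configuration structured according to the ietf-interfaces
# YANG model. The model dictates which keys are valid, their types,
# and where they live in the hierarchy, so bad input can be rejected
# before it ever reaches the device.
interface_config = {
    "ietf-interfaces:interfaces": {
        "interface": [
            {
                "name": "GigabitEthernet0/0",            # list key, must be unique
                "description": "Uplink to core",          # free-form string leaf
                "type": "iana-if-type:ethernetCsmacd",    # identity defined by IANA
                "enabled": True,                          # boolean leaf
            }
        ]
    }
}
```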

Transport Protocols

Now that we know how to model our data, we need to start communicating with our network devices. Transport protocols are in charge of the data exchange and provide features like encryption. The main consideration is which transport our API protocol uses; for example, if you are going to use NETCONF you can't use HTTP, since NETCONF does not support it (it typically runs over SSH).

Common transport protocols are:

  • SSH
  • TLS
  • HTTP(S)

Encoding

Encoding refers to how our data is formatted while it is being exchanged. At this moment, XML and JSON are the two main formats used to encode our data, but there are others in use, like Google Protocol Buffers.

The encoding piece is one of the great benefits of API based automation, since this is the part that provides structured communication.

Encoding formats our data in a structured way, following the rules imposed by the Data Models.

This makes it easier to work with the data in a programmatic way. Remember the emphasis I put on parsing at the beginning? This is the key part that makes our life easier.

Depending on the format used, there are rules that must be followed. Personally I prefer JSON, since it is cleaner and easier to read, but XML is a good choice as well.
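
To see why structured data beats raw text, here is a quick sketch: the same interface information as JSON can be loaded straight into native data structures, with no regular expressions involved (the payload below reuses the hypothetical example from the Data Models section):

```python
import json

# A JSON document as a device might return it over an API.
payload = """
{
  "ietf-interfaces:interfaces": {
    "interface": [
      {"name": "GigabitEthernet0/0", "enabled": true},
      {"name": "GigabitEthernet0/1", "enabled": false}
    ]
  }
}
"""

# One call turns the text into dictionaries and lists; no parsing logic needed.
data = json.loads(payload)
for interface in data["ietf-interfaces:interfaces"]["interface"]:
    print(interface["name"], "enabled" if interface["enabled"] else "disabled")
```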

API Protocols

Finally, we reach the last part: the protocols that communicate directly with the devices and execute our instructions. Today we have three main protocols:

  • RESTCONF
  • NETCONF
  • gRPC

However, this list can become obsolete quickly, since new protocols keep emerging. The idea behind this kind of protocol is to provide a programmatic way to interact with network devices and to offer nice features, like configuration datastores. For example, in NETCONF you have the startup, running and candidate configurations, which gives you great flexibility compared to the CLI approach.
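
As a small sketch of what that looks like in practice, here is a NETCONF session using the ncclient Python library. The device address and credentials are placeholders, and not every platform exposes a candidate datastore:

```python
from ncclient import manager

# Hypothetical device details; NETCONF usually listens on port 830 over SSH.
with manager.connect(
    host="192.0.2.10",
    port=830,
    username="admin",
    password="secret",
    hostkey_verify=False,
) as conn:
    # Retrieve the running configuration as structured XML.
    running = conn.get_config(source="running")
    print(running.xml)

    # On platforms with a candidate datastore, changes can be staged
    # there and committed as a single transaction, for example:
    # conn.edit_config(target="candidate", config=some_xml_payload)
    # conn.commit()
```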

When CLI Automation is good

There are a few cases where CLI based automation is better than the API approach. For example, when dealing with legacy devices that don't support any kind of API, the CLI approach is the only one that will work.

Another case could be when you are creating a script that will run only one time and the complexity is low; then a CLI approach can be good enough.

And if you are working with terminal consoles, CLI could be your only approach. Personally, if it is available, I would look for Zero Touch Provisioning (ZTP) or a similar approach. Try to stay away from automating terminal consoles!

Summary

Below is a handy table I found while studying this topic. I find it quite useful for quickly seeing the relationship between the different protocols and the schema, content and payload they use.

You may notice that SNMP is in the table but not covered in this article; that's because I try to stay away from SNMP for automation.

| Protocol           | Schema / Encoding | Content           | Payload    |
|--------------------|-------------------|-------------------|------------|
| CLI                | None              | Custom            | Text       |
| SNMP               | SMIv2             | MIBs              | ASN.1 BER  |
| REST               | XML / JSON        | XSD / JSD         | XML / JSON |
| NETCONF / RESTCONF | YANG              | YANG Modules      | XML / JSON |
| gRPC               | Protocol Buffers  | Service / Message | Binary     |

Conclusion

I hope this introduction to automation is useful. I'm by no means an expert on the topic, but I wanted to share what API based automation is, why it is better, and what its core components are. In the future I'm planning to dive deeper into each one of the protocols presented.