Mastering Python for DevOps: Top 15 Interview Questions and Answers

TLDR: This blog post covers the top 15 Python DevOps interview questions, providing detailed answers and real-world scenarios. It includes automation of AWS EC2 instances, Kubernetes pod status checks, Docker image cleanup, database backups, Jenkins job monitoring, JSON log parsing, user creation on remote servers, file monitoring, DNS record management, log file rotation, Prometheus metrics fetching, service management on remote servers, IAM user listing, memory usage monitoring, and Kubernetes application deployment.

In today's competitive job market, preparing for a DevOps interview can be daunting. This blog post will explore the top 15 Python DevOps interview questions, complete with detailed answers and real-world scenarios. Whether you're preparing for a job interview or looking to sharpen your skills, these questions will give you the edge you need.

1. Automating the Creation and Management of Amazon EC2 Instances

Question: How would you automate the creation and management of Amazon EC2 instances using Python?

Answer: To automate EC2 instance management, you can use the boto3 library, which is the AWS SDK for Python. Here’s an example script:

import boto3

ec2 = boto3.resource('ec2')
# create_instances returns a list of the launched instances
instances = ec2.create_instances(
    ImageId='ami-12345678',
    MinCount=1,
    MaxCount=1,
    InstanceType='t2.micro'
)
print(f'Created instance with ID: {instances[0].id}')

This script creates an EC2 instance by specifying the AMI ID, instance type, and the number of instances to launch.
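
Creation is only half of the question; managing existing instances matters just as much. As a minimal sketch (the instance ID is a placeholder), the same resource API can stop or terminate an instance:

import boto3

ec2 = boto3.resource('ec2')
instance = ec2.Instance('i-0123456789abcdef0')  # placeholder instance ID

instance.stop()                # stop the instance
instance.wait_until_stopped()  # block until it has stopped
instance.terminate()           # or remove it entirely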

2. Fetching and Displaying the Status of a Kubernetes Pod

Question: How can you use Python to fetch and display the status of a Kubernetes pod?

Answer: You can utilize the kubernetes Python library to interact with your Kubernetes cluster. Here’s how:

from kubernetes import client, config

config.load_kube_config()

v1 = client.CoreV1Api()
namespace = 'default'
pod_name = 'my-pod'

pod = v1.read_namespaced_pod(name=pod_name, namespace=namespace)
print(f'Pod status: {pod.status.phase}')

This script loads the Kubernetes configuration and retrieves the status of a specified pod.
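
It is also worth handling the case where the pod does not exist, since the client raises an ApiException with a 404 status. A small sketch of the same lookup with that check:

from kubernetes import client, config
from kubernetes.client.rest import ApiException

config.load_kube_config()
v1 = client.CoreV1Api()

try:
    pod = v1.read_namespaced_pod(name='my-pod', namespace='default')
    print(f'Pod status: {pod.status.phase}')
except ApiException as e:
    if e.status == 404:
        print('Pod not found.')
    else:
        raise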

3. Cleaning Up Old Docker Images

Question: Write a Python script to clean up old Docker images on a server to free up space.

Answer: You can use the docker library to manage Docker containers and images. Here’s an example:

import docker

client = docker.from_env()

images = client.images.list(filters={'dangling': True})
for image in images:
    client.images.remove(image.id)
print('Cleaned up old Docker images.')

This script removes all dangling images from the Docker environment.
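
The SDK also exposes Docker's own prune operation, which removes dangling images in a single call and reports how much space was freed; a minimal sketch:

import docker

client = docker.from_env()
result = client.images.prune(filters={'dangling': True})
print(f"Reclaimed {result.get('SpaceReclaimed', 0)} bytes.")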

4. Automating Database Backups to S3

Question: How would you use Python to automate database backups and upload them to an S3 bucket?

Answer: You can use the subprocess module to execute backup commands and boto3 to upload files to S3. Here’s an example:

import subprocess
import boto3

# Create the database dump; shell redirection ('>') is not interpreted when
# arguments are passed as a list, so redirect stdout to the file explicitly
with open('backup.sql', 'wb') as backup:
    subprocess.run(['mysqldump', '-u', 'user', '-p', 'database_name'],
                   stdout=backup, check=True)

# Upload the dump to S3
s3 = boto3.client('s3')
s3.upload_file('backup.sql', 'mybucket', 'backup.sql')
print('Backup uploaded to S3.')

This script creates a MySQL database backup and uploads it to an S3 bucket.
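
For large databases you may not want the dump on local disk at all. A sketch, assuming the bucket and database credentials already exist, that streams mysqldump's output straight to S3:

import subprocess
import boto3

s3 = boto3.client('s3')

# Pipe the dump's stdout directly into S3 instead of writing a local file
dump = subprocess.Popen(['mysqldump', '-u', 'user', '-p', 'database_name'],
                        stdout=subprocess.PIPE)
s3.upload_fileobj(dump.stdout, 'mybucket', 'backup.sql')
dump.wait()
print('Backup streamed to S3.')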

5. Monitoring Jenkins Job Status

Question: Write a Python script to monitor the status of a Jenkins job and send an alert if it fails.

Answer: You can use the python-jenkins library together with smtplib for email notifications. Here’s an example:

import jenkins
import smtplib

server = jenkins.Jenkins('http://jenkins_url', username='user', password='password')
job_name = 'my_job'

# get_job_info only reports the last build's number; fetch that build to read its result
last_build_number = server.get_job_info(job_name)['lastBuild']['number']
build_info = server.get_build_info(job_name, last_build_number)

if build_info['result'] != 'SUCCESS':
    with smtplib.SMTP('smtp.example.com') as smtp:
        smtp.sendmail('from@example.com', 'to@example.com',
                      'Subject: Jenkins alert\n\nJob failed!')
    print('Alert sent: job failed.')

This script checks the last build status of a Jenkins job and sends an email alert if it fails.
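
A bare string passed to sendmail carries no headers, so many mail servers will reject or mangle it. A sketch of a better-formed alert using the standard library's EmailMessage (the addresses and SMTP host are placeholders):

import smtplib
from email.message import EmailMessage

job_name = 'my_job'

msg = EmailMessage()
msg['Subject'] = f'Jenkins job {job_name} failed'
msg['From'] = 'from@example.com'
msg['To'] = 'to@example.com'
msg.set_content('The last build did not succeed. Please investigate.')

with smtplib.SMTP('smtp.example.com') as smtp:
    smtp.send_message(msg)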

6. Parsing JSON Logs

Question: How can you use Python to parse JSON logs and extract specific information?

Answer: You can use Python's built-in json module to parse JSON logs. Here’s an example:

import json

with open('logs.json') as f:
    logs = json.load(f)

for entry in logs:
    # Print entries whose log level marks an error
    if entry.get('level', '').lower() == 'error':
        print(entry)

This script reads a JSON log file and prints entries that contain errors.
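
Note that json.load assumes the whole file is a single JSON document (for example a list of entries). Many log pipelines instead write one JSON object per line (JSON Lines); a sketch for that format, using a hypothetical logs.jsonl file:

import json

with open('logs.jsonl') as f:
    for line in f:
        entry = json.loads(line)
        if entry.get('level', '').lower() == 'error':
            print(entry)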

7. Creating Users on Multiple Remote Servers

Question: Write a Python script to create a new user on multiple remote servers.

Answer: You can use the paramiko library to connect to remote servers via SSH. Here’s an example:

import paramiko

servers = ['server1', 'server2']
new_username = 'newuser'
new_password = 'password'

for server in servers:
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(server, username='user', password='password')
    # useradd is non-interactive (unlike adduser); set the initial password via chpasswd
    stdin, stdout, stderr = ssh.exec_command(
        f'sudo useradd -m {new_username} && '
        f'echo "{new_username}:{new_password}" | sudo chpasswd'
    )
    stdout.channel.recv_exit_status()  # wait for the command to finish
    ssh.close()
print('Users created on all servers.')

This script connects to each server over SSH, creates the user non-interactively, and sets an initial password.

8. Monitoring File Changes

Question: How would you use Python to monitor a file for changes and trigger an action?

Answer: The watchdog library can be used to monitor file system events. Here’s an example:

import time

from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class MyHandler(FileSystemEventHandler):
    def on_modified(self, event):
        print(f'{event.src_path} has been modified.')

observer = Observer()
# watchdog watches directories, so pass the directory containing the file
observer.schedule(MyHandler(), path='path/to/dir', recursive=False)
observer.start()
try:
    while True:
        time.sleep(1)  # avoid a busy loop
except KeyboardInterrupt:
    observer.stop()
observer.join()

This script watches the directory for modifications and prints a message whenever a file in it changes; compare event.src_path against the file you care about to react to a single file.

9. Querying and Updating DNS Records with AWS Route 53

Question: How can you use Python to query and update DNS records using AWS Route 53?

Answer: You can use the boto3 library to manage DNS records. Here’s an example:

import boto3

route53 = boto3.client('route53')

def update_dns_record(hosted_zone_id, record_name, record_type, values):
    change = {
        'Changes': [{
            'Action': 'UPSERT',
            'ResourceRecordSet': {
                'Name': record_name,
                'Type': record_type,
                'TTL': 300,
                'ResourceRecords': [{'Value': value} for value in values]
            }
        }]
    }
    route53.change_resource_record_sets(HostedZoneId=hosted_zone_id, ChangeBatch=change)

This function updates or creates a DNS record in Route 53.
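
The question also asks about querying records. A short usage sketch that calls the function above and then lists what is in the zone (the zone ID, record name, and IP address are placeholders):

update_dns_record('Z123456ABCDEF', 'app.example.com', 'A', ['203.0.113.10'])

records = route53.list_resource_record_sets(HostedZoneId='Z123456ABCDEF')
for record in records['ResourceRecordSets']:
    print(record['Name'], record['Type'])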

10. Managing and Rotating Log Files

Question: Write a Python script to manage and rotate log files on a server.

Answer: You can use the shutil and time modules to rotate log files by timestamping the old file. Here’s an example:

import shutil
import time

log_file = 'app.log'
backup_file = f"app_{time.strftime('%Y%m%d_%H%M%S')}.log"
shutil.move(log_file, backup_file)
open(log_file, 'w').close()  # recreate an empty log file
print('Log file rotated.')

This script renames the current log file and creates a new empty log file.
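
If the logs come from your own Python application, the standard library can rotate them for you; a minimal sketch with logging.handlers.RotatingFileHandler:

import logging
from logging.handlers import RotatingFileHandler

logger = logging.getLogger('app')
# Rotate at roughly 1 MB, keeping the five most recent backups
logger.addHandler(RotatingFileHandler('app.log', maxBytes=1_000_000, backupCount=5))
logger.warning('Rotation is handled automatically once the size limit is hit.')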

11. Fetching Metrics from a Prometheus Server

Question: How can you use Python to fetch metrics from a Prometheus server?

Answer: You can use the requests library to query the Prometheus API. Here’s an example:

import requests

response = requests.get('http://prometheus-server/api/v1/query', params={'query': 'up'})
metrics = response.json()
print(metrics)

This script fetches metrics from a Prometheus server.
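
The API returns a JSON envelope; for an instant query the series and their values live under data.result. A sketch that prints each target reported by the up metric (the server URL is a placeholder):

import requests

response = requests.get('http://prometheus-server/api/v1/query',
                        params={'query': 'up'})
response.raise_for_status()

for series in response.json()['data']['result']:
    target = series['metric'].get('instance', 'unknown')
    timestamp, value = series['value']
    print(f'{target}: up={value}')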

12. Restarting a Service on Multiple Remote Servers

Question: Write a Python script to restart a service on multiple remote servers if a specific process is not running.

Answer: You can use the paramiko library to check and restart services. Here’s an example:

import paramiko

servers = ['server1', 'server2']
process_name = 'my_process'
service_name = 'my_service'

for server in servers:
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(server, username='user', password='password')
    stdin, stdout, stderr = ssh.exec_command(f'pgrep {process_name}')
    if not stdout.read():
        ssh.exec_command(f'sudo systemctl restart {service_name}')
    ssh.close()
print('Checked and restarted services as needed.')

This script checks if a specific process is running and restarts the service if it is not.

13. Fetching the Current List of IAM Users

Question: How can you use Python to fetch the current list of IAM users in an AWS account?

Answer: You can use the boto3 library to list IAM users. Here’s an example:

import boto3

iam = boto3.client('iam')
response = iam.list_users()
for user in response['Users']:
    print(user['UserName'])

This script retrieves and prints the names of all IAM users in the AWS account.
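
Keep in mind that list_users returns a single page of results, so accounts with many users need pagination; a sketch using boto3's built-in paginator:

import boto3

iam = boto3.client('iam')
paginator = iam.get_paginator('list_users')
for page in paginator.paginate():
    for user in page['Users']:
        print(user['UserName'])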

14. Monitoring Memory Usage of a Process

Question: Write a Python script to monitor the memory usage of a process and kill it if it exceeds a threshold.

Answer: You can use the psutil library to monitor processes. Here’s an example:

import psutil

process_name = 'my_process'
threshold = 100 * 1024 * 1024  # 100 MB

for proc in psutil.process_iter(['name', 'memory_info']):
    if proc.info['name'] == process_name:
        if proc.info['memory_info'].rss > threshold:
            proc.kill()
            print(f'Killed {process_name} for exceeding memory threshold.')

This script checks the memory usage of a specified process and kills it if it exceeds the defined threshold.
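
The snippet above makes a single pass; to monitor continuously you would run the check on an interval, roughly as sketched below (the 30-second interval is an arbitrary choice):

import time
import psutil

process_name = 'my_process'
threshold = 100 * 1024 * 1024  # 100 MB

while True:
    for proc in psutil.process_iter(['name', 'memory_info']):
        try:
            if proc.info['name'] == process_name and proc.info['memory_info'].rss > threshold:
                proc.kill()
                print(f'Killed {process_name} for exceeding memory threshold.')
        except psutil.NoSuchProcess:
            pass  # the process exited between listing and killing
    time.sleep(30)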

15. Automating the Deployment of a Kubernetes Application

Question: How can you use Python to automate the deployment of a Kubernetes application using a YAML configuration file?

Answer: You can use the kubernetes library to deploy applications. Here’s an example:

from kubernetes import client, config, utils

config.load_kube_config()

# create_from_yaml expects a path to the manifest rather than an open file object
utils.create_from_yaml(client.ApiClient(), 'deployment.yaml')
print('Kubernetes application deployed.')

This script loads a YAML configuration file and deploys the application to the Kubernetes cluster.
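
After applying the manifest you usually want to confirm the rollout finished; a sketch that polls the deployment's ready replicas (the deployment name and namespace are assumptions standing in for whatever deployment.yaml defines):

import time
from kubernetes import client, config

config.load_kube_config()
apps = client.AppsV1Api()

# 'my-app' and 'default' are placeholders for the manifest's name and namespace
for _ in range(30):
    deployment = apps.read_namespaced_deployment('my-app', 'default')
    if deployment.status.ready_replicas == deployment.spec.replicas:
        print('Deployment is ready.')
        break
    time.sleep(5)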

Conclusion

These top 15 Python DevOps interview questions cover a range of essential skills and scenarios that you may encounter in a DevOps role. By familiarizing yourself with these questions and practicing the provided scripts, you will be well-prepared for your next interview. Good luck!