Tuesday, August 15, 2017

My Reading List - 2017

I spend a lot of time commuting. In an attempt to make my commute as productive as possible, I like to read books. I thought it might be useful, for other people out there on the interweb looking for some interesting reads, to share my reading list. So here it is.

  • Smartcuts - Shane Snow
  • The Lean Startup - Eric Ries
  • Articulating Design Decisions - Tom Greever
  • How to Win Friends and Influence People - Dale Carnegie
  • Presence - Amy Cuddy
  • The Like Switch - Jack Schafer
  • Start with Why - Simon Sinek
  • The Practice of Cloud System Administration (DevOps / SRE practices)
  • Site Reliability Engineering - Heaps of people.
  • The Phoenix Project - Gene Kim and some others.
  • HBR Guide to Persuasive Presentations - Nancy Duarte

Thursday, December 03, 2015

All Five!

Happy to report I've now passed all five AWS Certifications!

AWS Certified Solutions Architect - Associate

AWS Certified Developer - Associate

AWS Certified SysOps Administrator - Associate

AWS Certified Solutions Architect - Professional

AWS Certified DevOps Engineer - Professional



Tuesday, October 13, 2015

AWS Certified Solutions Architect - Professional



I'm pleased to report that this past week, whilst at AWS re:Invent, I sat and passed the AWS Certified Solutions Architect - Professional exam.

Having passed, I thought I'd share some of my thoughts and experiences. Hopefully, if you are planning on taking the exam, some of these "pearls" will help you prepare, at least mentally, for what's ahead.

The Exam

To start with, it's important to remember that in order to qualify to take this exam you must already hold the AWS Certified Solutions Architect - Associate badge.

The exam is 170 minutes long and includes 80 questions. 

You'll be measured on 8 domains of knowledge, with the largest percentage of the marks going to "Security". This kind of makes sense considering AWS makes no secret about how security is of paramount importance in everything they do.

This white paper is a very, very good read ... 

You can read all about the exam here ... 

The Questions 

Most of the questions are LONG, scenario-based questions. I felt a bit like I'd run a marathon by the time I clicked the "Submit" button at the end of the exam.

The best piece of advice I can offer here is READ the question completely and READ the answers completely. After all, this is a professional grade architecture exam and as Solutions Architects, there is an expectation that we can extract key requirements from a given scenario.

Unlike many of the other exams I've taken, where a set of answers typically includes two or three options which, to the trained eye, are obviously incorrect, most answer sets in this exam contained a full range of answers that could all be correct. However, what you are being asked to do is choose the best answer based on the scenario provided. Again, it's important to read and extract the key points or requirements from each question. 

Considering the number of questions and the time available, time management is really important. With 80 questions in 170 minutes, you have a shade over 2 minutes per question. 

I personally found working in 20 minute blocks with a target of 10 questions each block helped me manage my time.

That said, after going back over my marked answers, I only ended up with about 9 minutes remaining on the clock.

Marking for Review

This brings me to another piece of useful advice for keeping time on your side. Use the "mark for review" check box to push ahead with the exam if you end up getting stuck on a particular question. You can go back and have another crack at it at the end.

Preparation

In terms of study guides and resources, I personally found that the "Practice Exam", available for $40 USD through Kryterion, was an excellent starting point. Not only does it give you a feel for the questions and the time constraints you have to work with, it also provides a breakdown (once you've completed the exam) of how you performed in each of the domains. This helped guide me on some of the areas I needed to focus on.

Other materials I would recommend all candidates read ... 

AWS White Papers are a must:

Read through the product FAQs:

The reality is that there is no substitute for real-world experience. I have personally worked with the AWS platform for around 3 years, which gave me a solid foundation from which to prepare for the exam.

Conclusion

In closing, I think this is a great exam which really tests a broad set of skills. The exam prep guide should be used as the starting point for planning your study. 

The exam tests your knowledge of a wide range of AWS services, which can be a challenge if the scope of your work has been limited to a smaller subset of the more commonly used services. 

Don't forget that Security carries a lot of weight in terms of the overall mark, so really make sure you understand things like IAM, roles, policies, federation and web identity. 
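
To make that last point concrete, you should be able to read a basic policy document at a glance. The example below is minimal and purely illustrative (the bucket name is a placeholder) and simply grants read access to the objects in a single S3 bucket:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-bucket/*"
    }
  ]
}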

There is also a new "Well-Architected Framework" document that AWS published recently. This is definitely worth a read because it will help you understand the best practices that should shape your thinking when you make architectural decisions.


Good luck!

Tuesday, June 23, 2015

Teach a man to fish ...

Today somebody asked me a question which I thought warranted a blog post. For the purposes of this blog post, "somebody" will be referred to as Jeff.

So, Jeff came to me with a problem. Jeff had set out to build a particular solution in AWS. During his investigations he found an off-the-shelf CloudFormation template which deployed the exact solution he wanted.

Jeff downloaded the CloudFormation template from GitHub, logged in to the AWS Management Console and ran through the "Create new Stack" wizard. Jeff was on top of the world: the solution was being built in front of his very eyes, and so far all he'd had to do was a bit of googling and a few mouse clicks.

He was grinning like a Cheshire cat; life was good, CloudFormation was working its magic and he was going to be the office hero ... right up until the moment he saw the dreaded ROLLBACK_IN_PROGRESS message.

S**! Jeff thought to himself as he watched his beautiful solution being torn down, volume by volume, instance by instance, ELB by ELB.

He opened up the CloudFormation template using his trusted copy of Sublime and this is what he saw:

"Lorem ipsum dolor sit amet, consectetur adipiscing elit. Integer nec odio. Praesent libero. Sed cursus ante dapibus diam. Sed nisi. Nulla quis sem at nibh elementum imperdiet. Duis sagittis ipsum. Praesent mauris. Fusce nec tellus sed augue semper porta. Mauris massa. Vestibulum lacinia arcu eget nulla. Class aptent taciti sociosqu ad litora torquent per conubia nostra, per inceptos himenaeos."

Well, he didn't really see that, but I'm sure you can appreciate that to the untrained eye, CloudFormation templates can definitely look a little scary.

That's when Jeff decided to call me and ask for a little help. We walked through the template and tried to identify the reasons for the failure, which are beyond the scope of this post. One thing which did become clear from the silence on the other end of the line was that Jeff was struggling a little bit to keep up with my troubleshooting approach: how did we get from Error Message A to Solution B?

Jeff then reminded me of the famous quote: "Give a man a fish and you feed him for a day; teach a man to fish and you feed him for a lifetime".

Now, just to put a little context around my friend Jeff, he's a very smart developer. It would not take Jeff long to "learn how to fish". But what were the best resources to help Jeff "learn to fish"?

AWS have an awesome documentation library, and below I've included a few of what are, in my personal opinion, the best links for getting to grips with CloudFormation.

This first link is a great starting point for anyone wanting to start out with CloudFormation and understand the building blocks of a template:

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/template-anatomy.html
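
To give you a feel for that anatomy before you dive in, a template really boils down to a handful of top-level sections. The skeleton below is a minimal, purely illustrative example (the parameter and the bucket are placeholders, not part of any real template):

{
  "AWSTemplateFormatVersion" : "2010-09-09",
  "Description" : "A minimal, illustrative template skeleton",
  "Parameters" : {
    "BucketName" : { "Type" : "String" }
  },
  "Resources" : {
    "MyBucket" : {
      "Type" : "AWS::S3::Bucket",
      "Properties" : { "BucketName" : { "Ref" : "BucketName" } }
    }
  },
  "Outputs" : {
    "CreatedBucket" : { "Value" : { "Ref" : "MyBucket" } }
  }
}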

This next link is the bible of AWS CloudFormation resources. It provides an invaluable breakdown of every resource type you can create through CloudFormation. Definitely my first stop when handcrafting and troubleshooting templates.

http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-template-resource-type-ref.html

I'm sure I'll hear from Jeff again. But I know that, armed with an arsenal of new links, he will try his absolute best to catch that fish on his own first. He may not succeed, but he will learn a lot, and with each attempt, this:

"Lorem ipsum dolor sit amet, consectetur adipiscing elit. Integer nec odio. Praesent libero. Sed cursus ante dapibus diam. Sed nisi. Nulla quis sem at nibh elementum imperdiet. Duis sagittis ipsum. Praesent mauris. Fusce nec tellus sed augue semper porta. Mauris massa. Vestibulum lacinia arcu eget nulla. Class aptent taciti sociosqu ad litora torquent per conubia nostra, per inceptos himenaeos."

will start to look more like this:

  "Resources" : {
    "EC2Instance" : {
      "Type" : "AWS::EC2::Instance",
      "Properties" : {
        "InstanceType" : { "Ref" : "InstanceType" },
        "SecurityGroups" : [ { "Ref" : "InstanceSecurityGroup" } ],
        "KeyName" : { "Ref" : "KeyName" },
        "ImageId" : "ami-2323232"
      }
    },
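
One last tip for anyone following in Jeff's footsteps: before you even reach the "Create new Stack" wizard, you can ask CloudFormation to check a template for basic syntax errors from the CLI (the file name here is just a placeholder):

aws cloudformation validate-template --template-body file://my-template.json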

Wednesday, June 10, 2015

Programming monogamy

I decided it was high time I brought an end to my exclusive relationship with PHP and started playing the field a little.

I'm not suggesting that there is anything wrong with PHP. Exactly the opposite, in fact: not coming from a development background, I've been able to achieve some fantastic things with PHP. My journey from "total PHP newb" to "not so much of a PHP newb" has introduced me to a whole new world and some very interesting concepts. Starting to understand application development and some core programming principles has improved my understanding of software development within my company. It has also enabled me to have more constructive conversations with developers across my organisation, which was especially relevant when it came to re-factoring some of our software solutions for AWS.

That aside, I recently met this little beauty who goes by the name of Ruby. Why Ruby, and not Python or Node or some other scripting language? The main motivator was Rails, a web application framework that I've heard lots about and am keen to explore.

A lot of the PHP work I've done has been around building web applications and browser-based consoles for managing environments and services. I never really got stuck into a framework for developing applications in PHP; I normally handcraft everything with a simple MVC structure, like the one sketched below.
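
(The layout below is a simplified illustration of the kind of structure I mean, rather than a copy of any particular project.)

/public
    index.php          <- front controller; every request starts here
/app
    /controllers       <- one class or script per page/action
    /models            <- data access and business logic
    /views             <- templates, mostly HTML with a sprinkling of PHP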

This practice was hammered home by a cover-to-cover reading of "PHP for Absolute Beginners", which I'd highly recommend to anyone looking to get started with PHP or web application development in general.

Anyway, I digress. Since Ruby and Rails "appear" to go hand-in-hand, I decided to start learning some Ruby.

I wanted to start, as I did with PHP, with something really simple. Since most of what I do these days revolves around AWS, pulling back a list of EC2 instances and dumping them to the console seemed like a perfect place to start.

In my first little script below, I've created an empty hash and built a function which uses the AWS SDK for Ruby to return a list of instances and populate the hash with a subset of the information returned.

It then iterates through the hash, using the awesome .each method and spits out a "nicely" formatted report to the console.

Pretty basic, but it gave me the chance to get a grasp of some basic Ruby concepts like symbols and the awesome .each method.

These simple scripts inevitably form the building blocks for larger and more complex solutions, so sit tight and let's see where Ruby and I go from here.

#!/usr/bin/ruby 

require "aws-sdk"

$instance_hash = Hash.new('Nothing New')

################################
# => Function: get_ec2_instances
# => returns all running EC2 Instances for my account.
###############################

def get_ec2_instances
 ec2 = Aws::EC2::Client.new(region: 'ap-southeast-2')

 resp = ec2.describe_instances()
 resp[:reservations].each do | reservations |
  reservations[:instances].each do | instances |
   $instance_hash[instances[:instance_id]] = 
    {
    "accountId" => reservations[:owner_id],
    "state" => instances[:state][:name], 
    "privateIp" => instances[:private_ip_address]
    }
  end
 end
end

get_ec2_instances

$instance_hash.each do |key,value|
 puts "Intstance Id: #{key}"
 value.each do |k,v|
  puts "#{k} : #{v}"
 end
 puts "-" * 25
end

Monday, June 01, 2015

Route53 + RaspberryPi + Cron + PHP = lazy admin.

Thanks to my recently AWS-connected Pi, performing a scheduled DNS cutover from the comfort of my own bed could not have been easier.

With a little cron magic and some "Aws\Route53\Route53Client" you can easily schedule changes to Route53 records / record sets. 
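
For the scheduling side of things, a single crontab entry does the job. Something along these lines, where the time and the script path are purely illustrative, runs the cutover at 2am on a Saturday:

0 2 * * 6 /usr/bin/php /home/pi/phpscripts/dns-cutover/update-dns.php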


<?php
error_reporting(E_ALL);
ini_set("display_errors", 1);
require 'vendor/autoload.php';
// Create client object for Route53
$r53Client = \Aws\Route53\Route53Client::factory(array());
// Create client object for SES
$SesClient = \Aws\Ses\SesClient::factory(array(
    'region' => 'us-east-1'
));
// Function for sending notifications if the record change fails or for confirmation that the change has been made.
function StackNotification($body, $cnameDns)
{
    global $SesClient;
    $stackSubject = 'DNS Update Confirmation ' . "[$cnameDns]";
    $SesClient->sendEmail(array(
        'Source' => 'blahblah@mitchyb.com',
        'Destination' => array(
            'ToAddresses' => array(
                'blahblah@mitchyb.com'
            )
        ),
        'Message' => array(
            'Subject' => array(
                'Data' => $stackSubject
            ),
            'Body' => array(
                'Html' => array(
                    'Data' => $body
                )
            )
        )
    ));
}
function updateRecord($elbDns, $cnameDns)
{
    global $r53Client;
    // Update DNS Records
    try {
        $command = $r53Client->changeResourceRecordSets(array(
            'HostedZoneId' => 'Z16PRLGBWGMRUY',
            'ChangeBatch' => array(
                'Changes' => array(
                    array(
                        'Action' => 'UPSERT',
                        'ResourceRecordSet' => array(
                            'Name' => $cnameDns,
                            'Type' => 'CNAME',
                            'TTL' => 60 * 5,
                            'ResourceRecords' => array(
                                array(
                                    'Value' => $elbDns
                                )
                            )
                        )
                    )
                )
            )
        ));
        
        $msg = "Route53 record updated to " . $elbDns;
        StackNotification($msg, $cnameDns);
    }
    catch (Exception $e) {
        $errorMsg = "Route53 record update failed with error: $e";
        trigger_error($errorMsg);
        
        StackNotification($errorMsg, $cnameDns);
        exit;
    }
}
// Call the record set update function.
updateRecord('offline.mitchyb.com', 'blog.mitchyb.com');

Wednesday, May 27, 2015

Installing the AWS SDK for PHP onto my Raspberry Pi


Just some notes for my own reference on getting the AWS SDK for PHP working on my RaspberryPi.

1. Installed PHP and Apache (needed for this particular project)

pi@raspberrypi ~ $ sudo apt-get install apache2 php5 libapache2-mod-php5

2. Moved into my project folder and created the composer.json file (as per the SDK installation instructions).

nano composer.json

3. Popped in the required JSON.

{
    "require": {
        "aws/aws-sdk-php": "2.*"
    }

}

4. Ran the installer and ... bop-bow! ....

pi@raspberrypi ~/phpscripts/greenmode $ php composer.phar install
Loading composer repositories with package information
Installing dependencies (including require-dev)
Your requirements could not be resolved to an installable set of packages.

  Problem 1

    - aws/aws-sdk-php 2.4.0 requires guzzle/guzzle ~3.7.0 -> satisfiable by guzzle/guzzle[v3.7.0, v3.7.1, v3.7.2, v3.7.3, v3.7.4].

... and

 guzzle/guzzle v3.9.3 requires ext-curl * -> the requested PHP extension curl is missing from your system.

5. Loaded up the php5-curl package ....

pi@raspberrypi /var/www $ sudo apt-get install php5-curl

6. Everything working!

pi@raspberrypi ~/phpscripts/greenmode $ php composer.phar install
Loading composer repositories with package information
Installing dependencies (including require-dev)
  - Installing symfony/event-dispatcher (v2.6.8)
    Downloading: 100%         

  - Installing guzzle/guzzle (v3.9.3)
    Downloading: 100%         

  - Installing aws/aws-sdk-php (2.8.7)
    Downloading: 100%    

Just to be sure ... let's do something .... 

A quick EC2 iterator script ...

<?php 

        require 'vendor/autoload.php';

        $ec2Client = \Aws\Ec2\Ec2Client::factory(array(
                'profile' => 'dev',
                'region'  => 'ap-southeast-2'
        ));


        // Echoes the instance IDs of all running instances in the region.
        function allInstances(){

                $iterator = $GLOBALS['ec2Client']->getIterator('describeInstances',array(
                        'Filters' => array(
                                        array(

                                                'Name' => 'instance-state-name',
                                                'Values' => array('running')

                                                )

                                        ))
                                );

                foreach($iterator as $object){

                        echo $object['InstanceId'] . PHP_EOL;
                }

        }


allInstances();

... and we get our instances back. Sweet!

pi@raspberrypi ~/phpscripts/greenmode $ /usr/bin/php sdktest.php
i-4fcc3a81
i-4585708b
i-b4a6697a

i-960efa58

A little about Me

My name is Mitch Beaumont and I've been a technology professional since 1999. I began my career working as a desk-side support engineer for a medical devices company in a small town in the middle of England (Ashby-de-la-Zouch). I then joined IBM Global Services, where I began specialising in customer projects based on and around Citrix technologies. Following a couple of very enjoyable years with IBM, I relocated to London to work as a systems operations engineer for a large law firm, where I was responsible for the day-to-day operations and development of the firm's global Citrix infrastructure. In 2006 I was offered a position in Sydney, Australia. Since then I've had the privilege of working for and with a number of companies in various technology roles, including Solutions Architect and technical team leader.