Categories
Cloud Cyber Security Data Analysis Linux Project Management Splunk

Big Things Have Small Beginnings….

The story of my home lab…

I have always been a tinkerer, but recent months got me thinking about a versatile, low-power lab.

When it comes to labs you can really go down the rabbit hole and spend thousands on enterprise servers, firewalls, and routers.

Choice of Hardware

A brief look on eBay yields plenty of results.

I wanted to start cheap but effective, so I ended up buying an old HP EliteDesk 800.

It had 16 GB of RAM and came bundled with Windows 10 running off an SSD.

I decided to open it up and add some spare components, creating what might (erroneously) be called a Frankenstein's Monster of a system: a spare SanDisk 240 GB SSD and a 1 TB SSHD (old but gold).

I also had some RAM lying around from an old system of similar provenance: 8 GB extra, for 24 GB of RAM in total.

Choice of software

For the first iteration of my lab I decided to use the industry-standard and very compatible Proxmox VE to run multiple virtual machines.

As I add to it (with small-form-factor 'tiny' computers), I will probably install VMware's well-regarded ESXi on the newer hardware.

Proxmox Console
Pi Hole
Splunk Pi Hole App
Apache Superset
Jira

What am I running?

Installing the different types of software on Linux variants was a great excuse to revise my Linux skills:

  • Splunk – Running on Ubuntu Server, connected to Pi Hole and gathering data. I have learned to be generous with RAM and SSD space for this virtual machine.
  • Pi Hole – Ubuntu Server running Pi Hole and a Splunk Universal Forwarder. It is a newer variant of the software and hardware I used for my capstone project.
  • GitLab – A prebuilt virtual machine to learn more about DevOps.
  • A Windows 10 VM – I use Microsoft's Remote Desktop software to access it, and it runs very well.
  • Jira by Atlassian – Running on Ubuntu Server in a Docker container: a great way to practise my PM software skills.
  • Apache Superset – Running on Ubuntu Server in a Docker container: new software and a great way to keep my data analysis/visualisation skills sharp.

Again, the aim in running this variety of software is not to become an absolute expert in DevOps but to gain a very good understanding, and to be able to communicate with future staff who reside in that space.

Pitfalls:

It is easy to get seduced by the fantastic homelabs on show at https://www.reddit.com/r/homelab/, but as long as you remember why you are using your lab, you can stay in control.

HP EliteDesk Working as a Server
The Mini Lab
You can see the UPS at the bottom. I kept the router on top of the metal container, which acted as an improvised Faraday cage.

What Next?

I plan to deploy a pfSense firewall and add cheap commodity network switches. I also want to add my Lenovo workstation with graphics card passthrough, and maybe a few mini workstations with ESXi installed.

Categories
Cloud

Creating and Designing a Diagnostic Test using Qualtrics

Many moons ago I was picked as a technical consultant to design and implement an online test. It was conceived as a diagnostic numeracy and literacy test to help student teachers prepare for their LANTITE exams.

The initial idea showed so much promise that leaders at the organisation wanted to develop and commercialise a product based on it (a nice idea, but we had already been beaten to market by others).

The Literacy and Numeracy Test for Initial Teacher Education Students (the test) is designed to assess initial teacher education students’ personal literacy and numeracy skills to ensure teachers are well equipped to meet the demands of teaching and assist higher education providers, teacher employers and the general public to have increased confidence in the skills of graduating teachers.

ACER LANTITE website

As further explanation the following information is provided:

All students enrolled in an initial teacher education course (either undergraduate or postgraduate) will be expected to sit and meet the test standard prior to graduation. Some higher education providers may also require you to sit the test as part of their course entry requirements. In addition, some states and territories may require successful test results to complete your final practicum to register or to be employed as a teacher.

I approached it as designing a simulation of the test: I would create a bank of questions covering different topics, and when a student takes the test, questions are drawn from the bank at random.

Once students completed the test, they would be provided with feedback on their strengths and weaknesses, along with links to resources designed to help them improve in weaker topics.

Custom input validation making use of regular expressions

A custom introduction page was designed so that students enter relevant information (e-mail in this case) before commencing the test.

Results and feedback would be sent to this address upon completion of the test.

Input validation using regular expressions

This is what the regular expression looked like:
REGEX sample
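The exact pattern used in the survey isn't reproduced here, but the general idea can be illustrated with a typical e-mail validation regex. The pattern, function name, and test addresses below are my own examples, not taken from the original project:

```shell
# Illustrative e-mail validation regex (not the exact pattern from the project)
validate() {
  if echo "$1" | grep -Eq '^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$'; then
    echo "valid"
  else
    echo "invalid"
  fi
}

validate "student@example.edu"   # valid
validate "not-an-email"          # invalid
```

Qualtrics applies such a pattern through its custom validation settings rather than a shell command; the snippet above just demonstrates what a pattern of this shape accepts and rejects.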

In Practice

Fortunately the app is still active and I was able to log in with my credentials to access it.

Below are a set of screenshots of a simulation of the application running on both the desktop and mobile.

As mentioned above I added a regular expression to act as an input validation. The user would need to enter the correct information in the appropriate format.

See what happens when I don’t enter the correct information?

Once entered I am able to commence the test.

Once complete the results are emailed to the user.

Example of an e-mail sent to a user:

As you can see the above test provides a breakdown by categories of where the student did well and where they didn’t.

Tricks to use to manipulate data in Qualtrics:

Qualtrics Embedded Data

Embedded data is any extra information you would like recorded in your survey data in addition to the question responses. In this project it was used to store categorised information: the reading comprehension assessment was broken down into Access, Interpretation and Evaluation, whilst the writing part was divided into Grammar, Punctuation, Spelling and Text Organisation.

Using Embedded Data in Your Survey

Once embedded data has been added to the survey, it can be used in a number of ways, including:

  • Piped text to display embedded data in your survey questions.
  • Branch logic to determine what sections of a survey your respondent should take.
  • Display logic to determine whether a respondent should see a question.
  • Email tasks and email triggers to determine whether an email should be automatically sent upon completion of a survey (this is how it was used in this project).
  • Data & Analysis and Reports as part of your analysis.

Overall, more than 6,600 students used the tool over the course of several weeks.

Categories
Cloud Cyber Security Linux Raspberry Pi Splunk

Pi Hole as a Cyber Security Tool pt 4: The change in the browsing experience

A list of allowed and blocked queries

Introduction

Here is a link to the original LinkedIn article. Welcome to the 4th and final part of my series: Pi Hole as a Cyber Security tool.

I have written at length about how to install and set up the system. DNS sinkholes can be used as a form of security hardening aligned to different frameworks; the Australian 'Essential Eight' comes to mind. They can be deployed in the enterprise, but these articles are designed to help you deploy the technology in your home network.

Over the course of a month, the Pi Hole has handled over 1.2 million DNS queries from my home network alone. Over 200 thousand of those queries have been blocked because they were linked to domains known for tracking/telemetry; in total, around 16% of traffic consisted of domains designed to track where you and I go.

Change your browser

In addition, to harden your network further, you can deploy a more secure web browser such as Brave. It is available across most hardware and software platforms.

In a comparison by the School of Computer Science and Statistics at Trinity College Dublin, Brave was shown not to send data 'back home'. The study compared Google Chrome, Mozilla Firefox, Apple Safari, Brave Browser, Microsoft Edge and Yandex Browser.

For Brave with its default settings they did not find any use of identifiers allowing tracking of IP address over time, and no sharing of the details of web pages visited with backend servers. Chrome, Firefox and Safari all share details of web pages visited with backend servers. For all three this happens via the search autocomplete feature, which sends web addresses to backend servers in real-time as they are typed.

In Chrome a persistent identifier is sent alongside these web addresses, allowing them to be linked together. In addition, Firefox includes identifiers in its telemetry transmissions that can potentially be used to link these over time. Telemetry can be disabled, but again is silently enabled by default. Firefox also maintains an open web-socket for push notifications that is linked to a unique identifier and so potentially can also be used for tracking and which cannot be easily disabled.

Safari defaults to a choice of start page that potentially leaks information to multiple third parties and allows them to preload pages containing identifiers to the browser cache.

Safari otherwise made no extraneous network connections and transmitted no persistent identifiers, but allied iCloud processes did make connections containing identifiers.

The effect of Pi Hole on the browsing experience using different websites

Website requests were reduced by roughly 80-90% on average (a four- to five-fold drop), resulting in a faster, safer browsing experience. This was tested with Google Chrome on a Mac running Monterey.

Before Pi Hole: news.com.au

622 requests, 6.1 MB transferred, 12.5 MB resources, finish 21.76 s

With Pi Hole: news.com.au

122 requests, 2.1 MB transferred, 2.9 MB resources, finish 2.60 s

Before Pi Hole: theage.com.au

585 requests, 6.5 MB transferred, 14.4 MB resources, finish 31.84 s

With Pi Hole: theage.com.au

78 requests, 1.8 MB transferred, 4.8 MB resources, finish 11.28 s

Before Pi Hole: theverge.com

657 requests, 10.4 MB transferred, 22.1 MB resources, finish 30.89 s

With Pi Hole: theverge.com

53 requests, 1.3 MB transferred, 2.6 MB resources, finish 6.40 s
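Using the before/after request counts measured above, the percentage reduction per site can be sanity-checked with a few lines of shell:

```shell
# Percentage reduction in page requests, before vs. with Pi Hole
# (request counts taken from the measurements above)
reduction() { echo $(( 100 * ($1 - $2) / $1 )); }

echo "news.com.au:   $(reduction 622 122)% fewer requests"   # 80%
echo "theage.com.au: $(reduction 585 78)% fewer requests"    # 86%
echo "theverge.com:  $(reduction 657 53)% fewer requests"    # 91%
```

That works out to roughly an 80-90% reduction in requests per site.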

As you can see, just deploying the Pi Hole to perform its basic functions results in a markedly superior browsing experience. I would like to test this further in the future using older hardware (old iPads, old Android phones). In addition, I would like to install Pi Hole and Splunk on a single server running at home, in combination with an open-source firewall such as pfSense.

Who knows? Maybe that can form the basis of a new set of articles?

Conclusions

The lessons learned from this project are as follows:

  • Browsing data is tracked.
  • This traffic is a significant part of your web traffic.
  • Educate yourself about the risks, manage and mitigate said risks.
  • Share your knowledge with others.
  • Harden your network.

Thank you for your time!

Categories
Cloud Cyber Security Database Linux Raspberry Pi Splunk

Pi Hole as a Cyber Security Tool pt 3: Installing a Splunk Server

Pi Hole app on Splunk displaying some data on queries

Introduction

This is the third part in a series about using a Pi Hole as a Cyber Security tool. My previous articles are available and easy to access. Click here for the original LinkedIn article.

Methodology

  • Install Pi Hole on a Raspberry Pi.
  • Create an Ubuntu Server in the cloud.
  • Install Splunk on the Ubuntu Server.
  • Install the Splunk Universal Forwarder on the Raspberry Pi.
  • Access the Splunk Pi Hole app.

Create an Ubuntu Server in the Cloud

Splunk can run on multiple platforms but in this instance I felt comfortable using an Ubuntu Server variant in the cloud.

What is Splunk?

“Splunk is a software platform to search, analyse and visualise the machine-generated data gathered from the websites, applications, sensors, devices etc. which make up your IT infrastructure and business.”

There is a Pi Hole app that runs on Splunk that receives data from a specific Pi Hole, ingests it and visualises the data.

Creating an Ubuntu Server in the Cloud

Sign up for an account at https://www.digitalocean.com/, a cloud service provider.

When you log in, this screen will greet you:

To create a new server click on the ‘droplets’ tab:

Choose the following options:

  • Ubuntu 20.04.
  • Shared CPU plan.
  • Regular Intel with SSD.
  • The $20-a-month plan.
  • Manually format and mount.
  • The Singapore region.
  • Password authentication.
  • Tick 'enable backups'.

A new Ubuntu virtual machine will be created for you. Take note of the IP address, as it will be needed later.

Regional options:

Installing Splunk on Ubuntu 20.04.3

First change directory to /opt/

cd /opt/

Using the wget command copied from the Splunk Website:

This was accurate as of the time of the original posting (December 2021). In subsequent implementations of the server I have found that the version of Splunk Server has changed along with the command.

Copy the command:

wget -O splunk-8.2.3-cd0848707637-Linux-x86_64.tgz 'https://download.splunk.com/products/splunk/releases/8.2.3/linux/splunk-8.2.3-cd0848707637-Linux-x86_64.tgz'

Which looks like this:

Extract the archive using the following command:

 tar zxvf splunk-8.2.3-cd0848707637-Linux-x86_64.tgz

Change directory to splunk/bin

Start the Splunk Server: sudo ./splunk start

Once started, go to the IP address of the server on port 8000, for example 101.168.1.101:8000

 Login with your registered credentials.

 Browse the Splunk App store for the Pi Hole DNS App and Pi-Hole Add-On for Splunk:

Install both.

Ensure you are in the correct time zone

Another crucial thing is to set the correct time zone on both the Pi Hole (Raspberry Pi) and the Splunk Server. In the past these systems have defaulted to Greenwich Mean Time (maybe the systems know I am English). At any rate, a wrong time zone will feed incorrectly timestamped data to Splunk, and it will be displayed accordingly.

To check the timezone of your Ubuntu based system type the following command:

timedatectl

A sample output would look like this:

           Local time: Wed 2020-05-06 19:33:20 UTC
       Universal time: Wed 2020-05-06 19:33:20 UTC
             RTC time: Wed 2020-05-06 19:33:22    
            Time zone: UTC (UTC, +0000)           

System clock synchronized: yes
NTP service: active
RTC in local TZ: no

To list all available time zones type the following:

timedatectl list-timezones

Once you identify which time zone is accurate to your location, run the following command as sudo user:

sudo timedatectl set-timezone Australia/Melbourne

Here is what you should get:

Ubuntu system showing the correct time for the location.

Adjust accordingly to where you are located.

Configure Splunk Indexes

Login to Splunk Web

Navigate to Settings

Under the section DATA select Indexes

Click on New Index in the upper right-hand corner

Provide an index name of “dns”.  The other default settings should be okay for this setup

And click ‘save’.

Once done you will need to complete some basic configuration within Splunk:

Configuring Pi Hole DNS Add-on

Create a new input (source of data)

The API Key can be found in the web admin interface of the Pi Hole:

Check the update details option, leave settings as is:

Install Splunk Technical Add-ons and Apps

Log in to your Splunk web instance

Install Splunk Common Information Model (CIM)

Login to Splunk Web

Depending on where you are in the GUI, either click the gearbox next to Apps OR use the App Drop down and select Manage Apps

Choose find more apps

Search for CIM and choose Splunk Common Information Model (CIM)

If prompted, enter your credentials for the Splunk website, and continue with installation.

Install Force Directed App for Splunk

Login to Splunk Web

Depending on where you are in the GUI, either click the gearbox next to Apps OR use the App Drop down and select Manage Apps

Click on find more apps

Search for “Force Directed”

Install the Force Directed App for Splunk

If prompted, enter your credentials for the Splunk website, and continue with installation.

Configure Receiving Port

Go to Settings -> API/Web interface, then copy and paste the API key.

  • Create Input
  • Navigate to the settings tab
  • Click “Forwarding and receiving”

Configure the receiving data section:

Installing Splunk Universal forwarder for Raspberry Pi

We’re almost there! We have configured our Pi Hole, we have installed a Splunk Server and configured it. Now we just need to install a Universal Forwarder App onto the Raspberry Pi itself.

Install the Splunk Universal Forwarder on the RPI (ARMv6) – this is the software used to forward data into Splunk:

Link to Download: https://www.splunk.com/en_us/download/universal-forwarder.html

(Hint: once you start the download you can view a wget link on the right; you can copy this and download directly onto your RPi.)

Download Splunk Universal Forwarder from the website (to be installed in the Raspberry Pi)

Use the provided command and choose the forwarder for ARM Linux (the Raspberry Pi runs on the ARM architecture). Log in to your Raspberry Pi and enter the following commands:

wget -O splunkforwarder-8.1.5-9c0c082e4596-Linux-arm.tgz 'https://download.splunk.com/products/universalforwarder/releases/8.1.5/linux/splunkforwarder-8.1.5-9c0c082e4596-Linux-arm.tgz'

tar xvzf splunkforwarder-8.1.5-9c0c082e4596-Linux-arm.tgz -C /opt

Run sudo ./splunk start from $SPLUNK_HOME/bin

And this is what happens when you first start the universal forwarder

Hopefully it will ask you to accept the EULA and set a username and password. If that works, you're in!

There is a little bit more configuration needed, go to the following directory: 

/opt/splunkforwarder/etc/system/local

 find and edit the following file:

sudo nano outputs.conf

Add the line: server = <IP address of Splunk Server>:9997
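For reference, a minimal outputs.conf stanza might look like the following sketch. The group name and the IP address (a documentation placeholder) are my own examples, so substitute your Splunk server's actual address:

```ini
# Minimal forwarder output config (illustrative; replace the IP with yours)
[tcpout]
defaultGroup = default-group

[tcpout:default-group]
server = 192.0.2.10:9997
```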

Then configure Pi Hole for high-quality logging:

Use the following command:

sudo nano /etc/dnsmasq.d/02-pihole-splunk.conf

Add the following to the configuration file, then save and close:

log-queries=extra

SSH into your Pi-hole

Update the inputs.conf file

In this instance it would be located in /opt/splunkforwarder/etc/apps/search/default

sudo nano /opt/splunkforwarder/etc/apps/search/default/inputs.conf

Paste the following into the text editor

#inputs.conf
[monitor:///var/log/pihole.log]
disabled = 0
sourcetype = pihole

#optionally specify an index, if configured.
index = dns
[monitor:///var/log/pihole-FTL.log]
disabled = 0
sourcetype = pihole:ftl

#optionally specify an index, if configured.
index = dns

Restart the forwarder (from /opt/splunkforwarder/bin):

sudo ./splunk restart

Then restart the Pi-hole DNS resolver:

pihole restartdns

Splunk and the Pi Hole DNS App

Log on to the Splunk Server:

Open the Pi Hole App:

Success! The data is being sent over from the Pi Hole to the Splunk Server. You can add more clients to the Pi Hole and add more domains to the blocklists.

The next article discusses the effect of the Pi Hole on the browsing experience.

Categories
Cloud Cyber Security Linux Raspberry Pi Splunk

Pi Hole as a Cyber Security Tool pt 2: Install Pi Hole on Raspberry Pi

Pi-Hole Web Admin Interface

Introduction

This article is a copy of the original posted on LinkedIn. In my previous article I introduced the concept of using Pi Hole software running on the Raspberry Pi as a Cyber Security tool. This undertaking was part of a capstone project to complete a recent course. Pi Hole is, effectively, 'DNS sinkholing' software.

A DNS sinkhole can be a very effective tool when it comes to thwarting malware attacks. Every day, unwitting users click on links to malicious domains; a sinkhole prevents devices from connecting to those bad domains in the first place. Configuring the DNS server to supply a false IP address redirects the client elsewhere. It's a simple but surprisingly effective tool in your arsenal for proactively protecting your network using DNS.

A DNS sinkhole is a specific example of a false telemetry system (also known as a sinkhole server, internet sinkhole, or blackhole DNS). It can also be used to protect users from visiting known malicious or phishing sites. Thus, DNS sinkholes can serve malicious purposes as well as investigative and defensive ones.

This article will cover how to install Pi Hole on a Raspberry Pi. The tutorial assumes that you have flashed your MicroSD card with Raspbian and have it up and running.

Methodology

  • Install Pi Hole on a Raspberry Pi – covered by this article.
  • Create an Ubuntu Server in the cloud.
  • Install Splunk on the Ubuntu Server.
  • Install the Splunk Universal Forwarder on the Raspberry Pi.
  • Access the Splunk Pi Hole app.

Installing Pi Hole on Raspberry Pi:

Connect the Raspberry Pi to your Router via Ethernet Cable and switch it on.

If necessary, log on to your router to reserve the IP address of the Raspberry Pi. Your router's DHCP lease may expire, and the Pi's IP address would then change; reserving the address on the router keeps it the same. This will be important later, when configuring devices on your network to use the Pi Hole.

If you are using a Mac you can access the Pi using the Terminal and via the SSH command.

Simply type ssh pi@<ip address> and use the default login name and password for your Raspberry Pi. If using Windows, you can use PuTTY.

Once logged in: use the commands:

sudo apt-get update

sudo apt-get upgrade -y

To install Pi Hole: curl -sSL https://install.pi-hole.net | bash

Choose eth0 as the interface to use with Pi-hole. Press Tab to move the red highlight to Ok and then press Enter. This is our Gigabit Ethernet port which will provide the best possible connection.

Select your upstream DNS provider. We chose Google, but there are many others to choose from. Press Tab and then Enter.

 Accept the default list of blocked sites by pressing tab and enter.

Accept the default IPv4 and IPv6 protocols by pressing tab and enter.

Accept the current network settings and set them as static. Do make a note of the details as we will need them later.

Install the web admin interface by pressing tab and enter.

 Install the lighttpd web server used to serve the web admin pages by pressing tab and enter.

Accept the default log options.

 Accept the default privacy mode by pressing tab and enter.

The installation is complete and the final page recaps the IP address of the Pi-hole device and provides an admin webpage login password.

 Change the web admin password in the terminal using the following command. You will be prompted to type in the password twice.

$ pihole -a -p

Open a web browser and go to the IP address of your Raspberry Pi followed by /admin. For our setup we went to http://<IP address of Pi Hole>/admin/

You now have access to the Pi Hole web interface/dashboard.

Point your devices to Pi Hole

Using iOS

Go to the settings menu

Tap the arrow on the right of your Wi-Fi network, scroll down the settings to 'Configure DNS' and tap it.

You will be given an option of changing your DNS server:

Choose the ‘manual’ option then add the IP address of the Raspberry Pi/Pi Hole as a server and save.

Using Android

This is based on the vanilla Android variant found in Pixel and certain other devices.

Go to the menu where you choose your Wi Fi network. Press on the pencil/edit sign to make a change

You will be presented with a menu like this:

Change the DNS Server to the IP address of the Raspberry Pi.

These steps can be duplicated for Windows 10 and MacOS systems too. Google is your friend in this instance.

The next part will cover installing an Ubuntu Server in the cloud and installing Splunk on that server.

Categories
Cloud Cyber Security Linux Raspberry Pi Splunk

Pi Hole as a Cyber Security Tool pt 1

The Premise

The following blog entry is a copy of my Linkedin article.

Hello world! Until recently I was enrolled in an intense programme of study designed for people looking to start a career in Cyber Security.

This is the first post in a series about my project. I am using this medium to gather my thoughts and share my experiences with the world.

To conclude the course I had to complete a Capstone Project on a Cyber Security related topic. I currently work in a unique position where I am one part educator and one part technologist.

The Methodology

So I thought: why not use this unique background as the foundation of my project, and raise awareness of Cyber Security among others? My plan of action was this:

  • Use the Raspberry Pi currently collecting dust as a Pi Hole (DNS Sinkhole)
  • Connect the device to the home router.
  • Configure the device (Install a Universal Forwarder) to send data to a separate Splunk Server in the cloud.
  • Analyse the data within Splunk.

The Raspberry Pi has become the third best-selling computer of all time. It's a very commonly available multipurpose device that runs a variety of operating systems, and it is best known for running versions of Linux (Raspbian being an ARM-based form of Debian). A good familiarity with Linux and the command line was required for the installation and configuration of the software across multiple devices.

What is Pi-hole?

Pi-hole is a network-wide ad blocker. Rather than installing ad-blocking software on every device and in every browser, you can install Pi-hole once on your home network and it will protect all your devices. Because it works differently from a browser-based ad blocker, Pi-hole also blocks ads in non-traditional places, such as on games consoles and smart TVs.

How does it work?

Pi-hole functions as an internal, private DNS server for your home network. For many home users this service is already running on your router, but your router doesn't know which domains serve advertisements. Pi-hole does (it stores thousands of domains on its blocklist), so it will intercept any queries for known ad-serving domains and deny them access, meaning ads won't be downloaded.

How does this relate to Cyber Security?

To strengthen the Cyber Security posture of your organisation, one needs to enhance the knowledge of employees. I believe that good Cyber Security starts at home and that one should practise what one preaches. Over the course of 10 days of collecting large amounts of data, I discovered that a significant proportion of my web traffic consisted of domains tracking users, along with many attempts to send telemetry data 'back home'. Hardening the computer network is also part of the Australian Cyber Security Centre's 'Essential Eight' maturity model; under the section on 'user application hardening': 'web browsers do not process web advertisements from the internet'.

In the next parts I will share how I installed and configured the different components and got them to talk to each other.

Categories
Cloud Cyber Security Linux Raspberry Pi Splunk

Presentation of my Capstone Project of my Cyber Security course:

“Pi-Hole as a Cyber Security Tool”

This project investigated the effects of Pi Hole on telemetry data, tracking, malware and adware.

Categories
Azure Data Analysis Database PowerBI SQL

If you cannot measure it, you cannot improve it…

When you can measure what you are speaking about, and express it in numbers, you know something about it; when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind: it may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science.

Lord Kelvin

Introduction

The above quote from Lord Kelvin sets the tone for this blog post. The same question applies to the projects and programmes I've worked on: how do you measure success?

The Problem (To Solve)

Many years ago, I remember visiting a colleague in her office. She was of European origin with a charming demeanour and strong (and often very correct) opinions on online teaching. Needless to say we got on extremely well.

I remember taking a peek at her screens: on one was our Learning Management System, on the other a Word document with a checklist. It looked like some form of manual audit: trawling through each course and ticking whether certain resources existed.

My initial reaction was one of surprise: why isn't this automated? Are we still in the 90s? Why go through hundreds of courses manually to check whether they have resources? It's time-consuming and tedious. I remember telling my colleague: I'm going to find a way to automate this whole process.

The Solution

A few months later we gained access to a duplicate of the Learning Management System's internal database (the data dump was hundreds of gigabytes in size). It was not designed for reporting; it was designed to be the core engine of a live system.

The data inside it was nearly incomprehensible, so I went to work analysing it and designing queries: a great excuse to dust off my SQL skill set. Using inner joins and a lot of trial and error, I designed dozens of queries that extracted useful information from the database.

Of particular relevance was a query that scanned every course that had ever existed in the system (20,000-plus at the time) and gave a numerical breakdown of every resource type. With a single query I was able to extract the information my colleague needed: something that had taken her weeks of manual work.

Over time I refined the query to add the course name and URL to the tabular data. The query was then saved as a database view.

I provided the data to colleagues in the faculty, and they used it to produce basic PowerBI reports and, more importantly, to identify online courses that required a little 'extra help' and offer that help to academics. It was immensely successful, and a great example of a data-driven decision.

PowerBI and Database Connection Magic

At the same time, I was using PowerBI. What if I connected PowerBI to the database view and drew the data into the reporting software? I could then group the teaching tools into categories and create visualisations to understand the data better.

After several iterations I created a report that I was able to present at a University roadshow. My pet project had evolved into a full-blown enterprise project. Initially they called it 'LEO audit', but I took issue with the name and suggested 'Using LEO', which is far more 'academic friendly'.

During one of the roadshows, while I was presenting the report in the PowerBI app on an iPad, a senior manager raised his hand and asked: 'Can I have this now, please?'. If that isn't confirmation of the demand for data, I don't know what is.

Below is a sample of the report I created, again for an optimal experience please use a laptop or desktop to view the report (use the full screen option).

Categories
Azure Cloud Data Analysis Database PowerBI SQL

Learning Analytics

Samples of Work

During my work in Higher Education I have explored the realm of Learning Analytics: using data to drive important teaching and business decisions. I have been given the opportunity to present my work at conferences, and I derived great satisfaction from the positive feedback of attendees.

Interactive Reports

The most recent conference in question was the ACU Learning and Teaching Conference that was held in the North Sydney Campus of ACU. As part of a Learning Analytics showcase (part of the futures booth) I was tasked with creating numerous interactive PowerBI reports using teaching data derived from our Learning Management System.

Below is a sample exploration of the report.

Outcomes

The reports were loaded onto iPads and accessed via the official PowerBI app. What truly excited me about the app was that anyone could interact with the reports and ‘slice and dice’ the data available. I linked the reports to the ‘DIKW pyramid’, and the fact that reports could prompt users into investigating the data and discover patterns. This concept resonated with many of the attendees who were academics and taught online. It addressed an interesting problem that they encounter all the time: ‘Are my students interacting with the resources I have created?’.