golang dependency management – part 1

Since version 1.11, Go has shipped its own package dependency management capability – known as “Go Modules“. In this blog post, we will go through the steps to set up an application with REST and logging capabilities by integrating a few modules.

PS. The version of Go used in this blog is 1.13.

components and entities involved

features of the monitoring app

Basically, there are 2 REST API endpoints, as follows:

  • PUT – /log/:id, which serves as the ingestion point for any monitoring log / message, keyed by a log id
  • GET – /logs, which retrieves all logged messages from the repository

To keep things simple, the logged messages are stored in a plain local file – feel free to swap the repository for a persistent store such as an RDBMS for more serious use-cases 🙂 The messages are logged in JSON format; to facilitate this, we make use of the zerolog library.

To provide REST API capabilities, we could simply use the http package shipped with the Go distribution. However, to make life slightly easier, we can lean on open-source community contributions instead; the library chosen this time is go-json-rest.

step 1: setup a main entry point

Every Go program needs an entry point (unless you are writing a sharable library / module); following tradition, we will create a file named “main.go“:

package main

import (
	"fmt"
)

func main() {
	fmt.Println("in main, implementations TBD")
}

and also create a corresponding test file (a suggested practice for TDD), “main_test.go“:

package main

import "testing"

func TestMonitorAppMain(t *testing.T) {
	main()
}

now you should be able to run “go test” and see the test PASS, plus a printed line like “in main, implementations TBD“. Cool, the basic skeleton is done. Next we add our server implementation in the “app” package.

step 2: the “app” package

create a new file named “monitor_app.go” with this initial skeleton:

package app

// MonitorApp hosts the REST endpoints of the monitoring application.
type MonitorApp struct {}

// NewMonitorApp is the factory method for the MonitorApp structure.
func NewMonitorApp() *MonitorApp {
	pInstance := new(MonitorApp)
	return pInstance
}

// Init initialises the MonitorApp; the implementation comes later.
func (m *MonitorApp) Init() (err error) {
	return
}

since we need to add REST capabilities to the monitor app, let’s import the go-json-rest module:

import "github.com/ant0ine/go-json-rest/rest"

if we now re-compile the code (or glance at an IDE), some errors pop up telling us that the go-json-rest/rest API is not recognised or found. We definitely need to really “import” it into the project… let’s get back to a console / terminal and start updating the module dependencies.

but before that, go back to our “main.go” and modify the main func to create an instance of the monitor app:

package main

import (
	"github.com/quoeamaster/golang_blog_dependency_mgmt/app"
)

func main() {
	app.NewMonitorApp()
}

we need to import the “app” package and simply replace the standard-out println with the creation of the monitor app – app.NewMonitorApp()

PS. when we import the “app” package, we MUST use the full path starting with github.com; otherwise we would face an issue where our code could ONLY run in the local development environment. The reason is that importing only “app” means a relative / local code path is chosen, which of course runs / compiles perfectly on our machine; however, if somebody else wants to run this code and didn’t download all related libraries to the same code-paths, compilation errors would pop up at once.

step 3: the “go mod” command

the “go mod” command creates a module dependency config for the existing project; we need to specify the root package name for the module – REMEMBER the root package name!

go mod init github.com/quoeamaster/golang_blog_dependency_mgmt

after running the init command, we should see a new file “go.mod” created with the initial contents:

module github.com/quoeamaster/golang_blog_dependency_mgmt

go 1.13

now run “go test” again and spot that the modules are downloaded for us; the updated go.mod looks like this:

module github.com/quoeamaster/golang_blog_dependency_mgmt

go 1.13

require (
	github.com/ant0ine/go-json-rest v3.3.2+incompatible
)

woot! The go-json-rest module has finally been imported for us. At this point, the “go test” run should work fine again and give us a PASS.

if you look carefully, there should also be another file named “go.sum”, which declares the checksums for the Go modules involved in this project:

github.com/ant0ine/go-json-rest v3.3.2+incompatible h1:nBixrkLFiDNAW0hauKDLc8yJI6XfrQumWvytE1Hk14E=
github.com/ant0ine/go-json-rest v3.3.2+incompatible/go.mod h1:q6aCt0GfU6LhpBsnZ/2U+mwe+0XB5WStbmwyoPfc+sk=
...
golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=

interestingly, we only imported the go-json-rest module, but many other modules are listed here as well… the idea is simple: each module we employ might itself depend on other modules; hence when we import / download a module, its dependent module(s) are downloaded eventually too. That’s why, even though our app is tiny in code size for now, the final executable might still be rather “heavy”…

step 4: add in the REST features

once the go-json-rest module is imported, let’s add back the missing pieces. First set up the REST router with the endpoints we want to provide (e.g. GET /logs):

import (
	"log"
	"net/http"

	"github.com/ant0ine/go-json-rest/rest"
)

// Init sets up the REST api and starts the http server.
func (m *MonitorApp) Init() (err error) {
	// create REST api
	api := rest.NewApi()
	api.Use(rest.DefaultDevStack...)
	router, err := rest.MakeRouter(
		rest.Put("/log/:id", m.LogMsgWithId),
		rest.Get("/logs", m.GetAllLogs),
	)
	if err != nil {
		log.Fatal(err)
	}
	api.SetApp(router)
	// ListenAndServe blocks until the server exits
	log.Fatal(http.ListenAndServe(":8100", api.MakeHandler()))
	return
}

we declared 2 endpoints here: one for an HTTP GET operation under URI /logs, and one for an HTTP PUT operation under URI /log/:id (with a path param); hence 2 corresponding funcs need to be provided (these also use “fmt”, “io/ioutil” and “strings”, so add them to the import block; and remember to actually invoke Init() from main, otherwise the server never starts):

// LogMsgWithId is the REST endpoint implementation for PUT /log/:id
func (m *MonitorApp) LogMsgWithId(w rest.ResponseWriter, req *rest.Request) {
	id := req.PathParam("id")

	defer req.Body.Close()
	bContent, err := ioutil.ReadAll(req.Body)
	if err != nil {
		panic(err)
	}
	// trim the trailing newline and log the message as JSON
	content := strings.Trim(string(bContent), "\n")
	m.Logger.Info().Str("id", id).Msg(content)
}

// GetAllLogs is the REST endpoint implementation for GET /logs
func (m *MonitorApp) GetAllLogs(w rest.ResponseWriter, req *rest.Request) {
	bContent, err := ioutil.ReadFile(m.LogFilePointer.Name())
	if err != nil {
		panic(err)
	}
	fmt.Println(string(bContent))
}

you can spot that there is a Logger object employed within these funcs; yep! You are right, we need to import the logging module from zerolog.

add the import statement:

import "github.com/rs/zerolog"

and the same as before: run a “go test” command and, as if by miracle, all the dependencies are updated again~

module github.com/quoeamaster/golang_blog_dependency_mgmt

go 1.13

require (
	github.com/ant0ine/go-json-rest v3.3.2+incompatible
	github.com/rs/zerolog v1.17.2
)

step 5: test the app

by this moment in time, our monitor app is complete (yep, not perfect, but for demo purposes that’s way enough :))) ). Let’s do some testing on it; we will use the curl command to simulate REST calls (do feel free to use other tools like Postman as well). 2 templates are provided within the GitHub project, with contents as below:

# get all logs in that repository (log file)

curl -XGET localhost:8100/logs

...

# curl with body

curl -XPUT localhost:8100/log/123456 -H "Content-Type: application/json" -d '{
"message": "127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] \"GET /apache_pb.gif HTTP/1.0\" 200 2326 \"http://www.example.com/start.html\" \"Mozilla/4.08 [en] (Win98; I ;Nav)\""
}'

now go to a console / terminal and paste the curl templates to execute; results similar to the following should show up:

standard out from the monitor_app

summary:

We have learnt a few things today, GREAT~~~

  • how to set up a module dependency for a Go project – “go mod init {root-package-name}”
  • how to update the project’s dependencies – add your imports and run the corresponding test files (e.g. main_test.go)
  • also… how to build a simple REST API app / server within minutes (thanks to the open-source contributors~)

what’s next???

In the coming part 2 of the blog, we will handle another module-based use-case – how to create your own sharable modules, plus versioning of modules (e.g. v1, v2, v3 …). Stay tuned and happy coding

:)))
