Go-specific interview questions covering concurrency, interfaces, error handling, tooling, and idiomatic patterns.
Show a concrete pattern: "Every goroutine I launch receives a context. I use select with ctx.Done() so it exits cleanly when the parent cancels. I monitor goroutine counts in production dashboards."
Goroutines are lightweight, multiplexed onto OS threads by the Go runtime scheduler. Channels provide typed, synchronised communication between goroutines. Goroutine leaks happen when a goroutine blocks forever on a channel send/receive or waits on a resource that never resolves. Prevention strategies: always use context.Context with timeouts or cancellation, use select with a done channel, close channels when no more values will be sent, and use tools like runtime.NumGoroutine() or pprof to monitor goroutine counts in production.
Core Go skill. Candidates who cannot explain goroutine lifecycle management will write services that leak memory over time. Ask for a specific leak they have diagnosed.
Emphasise implicit satisfaction: "I define interfaces where they are consumed, not where they are implemented. This keeps packages decoupled and makes testing trivial — I just implement the two methods the function actually calls."
Go interfaces are satisfied implicitly — no "implements" keyword. This lets consumers define the interfaces they need rather than producers declaring what their types satisfy. Best practice: define small interfaces at the point of use, accept interfaces and return structs. Standard library interfaces like io.Reader, io.Writer, and fmt.Stringer are composable building blocks. Strong candidates mention the "accept interfaces, return structs" principle and know when an interface with one or two methods is better than a large one.
Tests understanding of Go philosophy. Candidates from Java backgrounds often create large interfaces and declare them on the wrong side. Look for small, consumer-defined interfaces and understanding of why this matters for testing and decoupling.
Show a clear philosophy: "I wrap errors with context at each layer using fmt.Errorf with %w, define sentinel errors for expected conditions callers need to handle, and never use panic for normal error paths."
Go uses explicit error return values. Strong candidates discuss: wrapping errors with fmt.Errorf and %w for context while preserving the chain, using errors.Is() and errors.As() to inspect wrapped errors, defining sentinel errors (var ErrNotFound = errors.New(...)) for expected conditions, and custom error types for rich error information. They should mention that panic is reserved for truly unrecoverable situations, not control flow.
Fundamental Go pattern. Candidates who find error handling tedious and try to work around it (empty error checks, panic-recover for flow control) will write unreliable code. Look for disciplined, consistent error handling with proper context.
Show production awareness: "I use errgroup.Group with a concurrency limit. Each worker reads from a shared channel, processes, and sends results to an output channel. The group handles error propagation and cancellation."
Fan-out: launch multiple goroutines to process work from a shared channel. Fan-in: merge results from multiple goroutines into a single channel. Use cases: parallel HTTP calls, batch processing, pipeline stages. Pitfalls: unbounded goroutine creation (use a worker pool with a semaphore or fixed goroutine count), error propagation (use errgroup.Group), result ordering if needed, and clean shutdown via context cancellation. Strong candidates mention the sync and errgroup packages.
Senior concurrency question. Candidates who can whiteboard fan-out/fan-in with proper shutdown and error handling demonstrate real concurrent systems experience. Ask about backpressure handling.
Describe the pattern: "I use table-driven tests with t.Run for named subtests. Standard library assertions are clear enough for most cases. I add testify when I need generated mocks or complex assertion chains."
Table-driven tests define test cases as a slice of structs with inputs and expected outputs, then loop through them. This reduces boilerplate and makes adding cases trivial. The standard library testing package is sufficient for most needs, with t.Run for subtests, t.Parallel for concurrent tests, and t.Helper for clean stack traces. Testify adds assertion helpers and mocks. Strong candidates prefer the standard library for simplicity and reach for testify only when assertion readability or mock generation adds genuine value.
Tests professional Go development practices. Candidates who do not know table-driven tests are likely inexperienced with Go. Those who immediately reach for heavy frameworks may not appreciate Go's simplicity.
Highlight MVS: "Go uses minimum version selection, which is deterministic and avoids surprise upgrades. If two deps need v1.2 and v1.5 of a library, Go picks v1.5 — the minimum that satisfies both."
Go modules use semantic versioning with go.mod and go.sum files. Minimum version selection (MVS) picks the minimum version that satisfies all requirements — unlike npm or pip which pick the latest. For major version conflicts, Go treats v2+ as different module paths (import path includes /v2). Strong candidates explain MVS, the replace directive for local development, and how go mod tidy cleans up unused dependencies.
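A go.mod sketch showing these pieces together (module paths and versions are hypothetical):

```
module example.com/app

go 1.22

require (
	example.com/lib v1.5.0      // MVS: the highest of the minimums any requirer asks for
	example.com/other/v2 v2.1.0 // major versions v2+ live at a distinct module path
)

// Local-development override, typically removed before release.
replace example.com/lib => ../lib
```

Running go mod tidy would prune any requirement no longer imported and keep go.sum in step.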
Baseline Go tooling knowledge. Candidates who cannot explain modules and dependency resolution will struggle with real projects. Ask about a diamond dependency problem they have resolved.
Show best practices: "Context is always the first parameter, never stored in a struct. I use WithTimeout for external calls, WithCancel for spawned goroutines, and only use Value for truly request-scoped metadata like trace IDs."
Context carries deadlines, cancellation signals, and request-scoped values across API boundaries. Correct usage: pass as the first parameter to functions, use WithTimeout/WithCancel for lifecycle management, check ctx.Err() or select on ctx.Done(). Incorrect usage: storing context in a struct, using context.Value for required dependencies (use function parameters instead), creating contexts with overly long timeouts, or ignoring cancellation in long-running operations.
Critical for production Go services. Candidates who store context in structs or use Value as a dependency injection mechanism will create maintenance problems. Look for understanding of cancellation propagation.
Give a practical example: "My service constructor accepts an interface for the database layer but returns *UserService. Tests pass a mock implementing just the two methods used. Callers get the full concrete type."
Functions should accept interface parameters to remain flexible and testable, but return concrete struct types so callers have full access to the implementation. Package design: keep packages small and focused, avoid package-level state, use internal/ for implementation details, and name packages by what they provide, not what they contain. Strong candidates discuss the standard library as a model for package design and mention avoiding circular dependencies.
Tests architectural maturity in Go. Candidates from OOP backgrounds often create unnecessary abstractions. Go favours simplicity — look for flat package structures, minimal interfaces, and concrete return types.
Lead with tooling: "I expose /debug/pprof in staging, capture a 30-second CPU profile under load, and use go tool pprof to find the hottest code paths. For GC issues, I check the heap profile and tune GOGC."
Strong answers describe a systematic approach: enable pprof HTTP endpoints, capture CPU and memory profiles, use go tool pprof to analyse hot paths, check goroutine profiles for contention, examine trace output with go tool trace for scheduler latency. Common issues: lock contention (sync.Mutex), excessive GC pressure from allocations, goroutine leaks, and inefficient serialisation. Best candidates mention runtime metrics (GOGC, GOMEMLIMIT) and benchmarking with testing.B.
Senior Go question. Developers who cannot profile are guessing at performance problems. Those who know pprof, trace, and benchmarking can systematically improve performance. Ask about a specific performance issue they diagnosed.
Show balanced judgement: "I use generics for utility functions like slices.Contains and type-safe result types. I avoid them when a concrete type or small interface is clearer — Go readability matters more than type gymnastics."
Generics (Go 1.18+) eliminate the need for interface{}/any casts and code generation for type-safe collections and utilities. Good use cases: generic data structures (maps, sets, queues), utility functions (Map, Filter, Contains), and reducing boilerplate across similar types. Cases to avoid generics: when a simple interface suffices, when it makes code harder to read, or when the function only works with one or two types. Strong candidates show restraint — generics are a tool, not a goal.
Tests whether candidates adopt features thoughtfully. Those who use generics everywhere are fighting Go idioms. Those who refuse to use them are missing genuine improvements. Look for practical examples with clear trade-off reasoning.