Spring Boot Logging in 2026: Structured Production Logs with Logback and JSON

A complete guide to structured logging in Spring Boot: Logback JSON configuration, MDC for tracing, production best practices, and ELK Stack integration.

Structured Logging in Spring Boot with Logback and JSON

Traditional text logs quickly become unmanageable in production. When hundreds of instances emit thousands of log lines per second, finding a specific error turns into a nightmare. Structured JSON logs change this entirely by making every event queryable and automatically analyzable.

ํ•ต์‹ฌ ํฌ์ธํŠธ

Spring Boot 3.4 ์ด์ƒ์€ ์™ธ๋ถ€ ์˜์กด์„ฑ ์—†์ด ๊ตฌ์กฐํ™”๋œ JSON ๋กœ๊น…์„ ๋„ค์ดํ‹ฐ๋ธŒ๋กœ ์ง€์›ํ•ฉ๋‹ˆ๋‹ค. ์ด์ „ ๋ฒ„์ „์—์„œ๋Š” Logback Logstash Encoder๊ฐ€ ์—ฌ์ „ํžˆ ํ‘œ์ค€ ์†”๋ฃจ์…˜์ž…๋‹ˆ๋‹ค.

Why Adopt Structured Logs

The Limits of Traditional Text Logs

A typical text log looks like this:

text
2026-03-27 10:15:32.456 INFO  [order-service,abc123] c.e.s.OrderService - Order created for user john@example.com, amount: 150.00€, items: 3

์ด ํ˜•์‹์€ ์šด์˜ ํ™˜๊ฒฝ์—์„œ ์—ฌ๋Ÿฌ ๋ฌธ์ œ๋ฅผ ๋ฐœ์ƒ์‹œํ‚ต๋‹ˆ๋‹ค. ํŠน์ • ์ •๋ณด๋ฅผ ์ถ”์ถœํ•˜๋ ค๋ฉด ๋ณต์žกํ•˜๊ณ  ๊นจ์ง€๊ธฐ ์‰ฌ์šด ์ •๊ทœ ํ‘œํ˜„์‹์ด ํ•„์š”ํ•ฉ๋‹ˆ๋‹ค. ์„œ๋น„์Šค ๊ฐ„ ์ƒ๊ด€๊ด€๊ณ„ ๋ถ„์„์—๋Š” ์—„๊ฒฉํ•œ ๊ทœ์•ฝ์ด ํ•„์š”ํ•œ๋ฐ ํŒ€๋งˆ๋‹ค ํ•ด์„์ด ๋‹ฌ๋ผ์ง‘๋‹ˆ๋‹ค. Elasticsearch ๊ฐ™์€ ๋ถ„์„ ๋„๊ตฌ๋Š” ์ด๋Ÿฌํ•œ ๋น„๊ตฌ์กฐํ™” ๋ฌธ์ž์—ด์„ ํšจ์œจ์ ์œผ๋กœ ์ธ๋ฑ์‹ฑํ•˜๊ธฐ ์–ด๋ ต์Šต๋‹ˆ๋‹ค.

JSON ํ˜•์‹์˜ ์žฅ์ 

๊ฐ™์€ ์ด๋ฒคํŠธ๋ฅผ JSON์œผ๋กœ ํ‘œํ˜„ํ•˜๋ฉด ์ฆ‰์‹œ ํ™œ์šฉํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

json
{
  "@timestamp": "2026-03-27T10:15:32.456Z",
  "level": "INFO",
  "logger": "com.example.service.OrderService",
  "message": "Order created",
  "service": "order-service",
  "traceId": "abc123",
  "userId": "john@example.com",
  "orderId": "ORD-789456",
  "amount": 150.00,
  "currency": "EUR",
  "itemCount": 3
}

Every field becomes filterable and aggregatable. A single Elasticsearch query instantly returns all orders above 100 euros placed in the last 15 minutes. Kibana dashboards visualize trends without any manual parsing.
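As an illustration, such a query could look like the following (the index name and field names are assumed to match the example document above):

```json
GET logs-order-service-*/_search
{
  "query": {
    "bool": {
      "filter": [
        { "range": { "amount": { "gt": 100 } } },
        { "range": { "@timestamp": { "gte": "now-15m" } } }
      ]
    }
  }
}
```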

Spring Boot 3.4 ์ด์ƒ์˜ ๋„ค์ดํ‹ฐ๋ธŒ ์„ค์ •

๊ตฌ์กฐํ™” JSON ๋กœ๊ทธ ํ™œ์„ฑํ™”

Spring Boot 3.4๋Š” logging.structured ์†์„ฑ์„ ํ†ตํ•ด ๊ตฌ์กฐํ™” ๋กœ๊น…์— ๋Œ€ํ•œ ๋„ค์ดํ‹ฐ๋ธŒ ์ง€์›์„ ๋„์ž…ํ–ˆ์Šต๋‹ˆ๋‹ค. ์ด ๋ฐฉ์‹์€ ์ถ”๊ฐ€ ์˜์กด์„ฑ์„ ์ „ํ˜€ ํ•„์š”๋กœ ํ•˜์ง€ ์•Š์Šต๋‹ˆ๋‹ค.

yaml
# application.yml
# Native structured logging configuration for Spring Boot 3.4+
logging:
  structured:
    # Output format: ecs (Elastic Common Schema) or logstash; gelf requires a newer Spring Boot version
    format:
      console: ecs
      file: ecs
  file:
    name: /var/log/app/application.log
  level:
    root: INFO
    com.example: DEBUG

The ECS (Elastic Common Schema) format guarantees direct compatibility with Elasticsearch and Kibana without any extra configuration.
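For reference, the native ECS output has roughly the following shape (field values here are illustrative):

```json
{
  "@timestamp": "2026-03-27T10:15:32.456Z",
  "log.level": "INFO",
  "process.pid": 1,
  "process.thread.name": "http-nio-8080-exec-1",
  "service.name": "order-service",
  "log.logger": "com.example.service.OrderService",
  "message": "Order created",
  "ecs.version": "8.11"
}
```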

Customizing JSON Fields

To add business fields to every log line, Spring Boot allows additional properties to be configured.

yaml
# application.yml
# Custom fields in structured logs
logging:
  structured:
    format:
      console: ecs
    ecs:
      # Service information added to every log
      service:
        name: ${spring.application.name}
        version: ${app.version:1.0.0}
        environment: ${spring.profiles.active:default}
        node-name: ${HOSTNAME:unknown}

java
// Programmatic configuration for additional fields (Spring Boot 3.4+)
package com.example.logging.config;

import org.springframework.boot.logging.structured.StructuredLoggingJsonMembersCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LoggingConfig {

    // StructuredLoggingJsonMembersCustomizer is the customization hook
    // introduced alongside native structured logging in Spring Boot 3.4
    @Bean
    StructuredLoggingJsonMembersCustomizer<Object> jsonMembersCustomizer() {
        // Adds static fields to every structured log line
        return members -> {
            members.add("team", "backend");
            members.add("region", System.getenv("AWS_REGION"));
        };
    }
}

์ด ํ•„๋“œ๋“ค์€ ๋ชจ๋“  ๋กœ๊ทธ ๋ผ์ธ์— ํ‘œ์‹œ๋˜์–ด ๋Œ€์‹œ๋ณด๋“œ์—์„œ ํŒ€์ด๋‚˜ ๋ฆฌ์ „๋ณ„๋กœ ํ•„ํ„ฐ๋งํ•˜๊ธฐ ์‰ฝ๊ฒŒ ๋งŒ๋“ญ๋‹ˆ๋‹ค.

JSON ์ธ์ฝ”๋”๋ฅผ ์‚ฌ์šฉํ•œ ํด๋ž˜์‹ Logback ์„ค์ •

Logstash Encoder ์˜์กด์„ฑ

Spring Boot 3.4 ๋ฏธ๋งŒ ๋ฒ„์ „์ด๊ฑฐ๋‚˜ ๊ณ ๊ธ‰ ์ปค์Šคํ„ฐ๋งˆ์ด์ง•์ด ํ•„์š”ํ•œ ๊ฒฝ์šฐ, Logstash Logback Encoder๊ฐ€ ์—ฌ์ „ํžˆ ํ‘œ์ค€ ์†”๋ฃจ์…˜์œผ๋กœ ๋‚จ์•„ ์žˆ์Šต๋‹ˆ๋‹ค.

xml
<!-- pom.xml -->
<!-- Dependency for JSON logging with Logback -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>7.4</version>
</dependency>

Complete Logback Configuration

The logback-spring.xml file gives full control over the output format.

xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- src/main/resources/logback-spring.xml -->
<!-- Logback configuration for structured JSON logs -->
<configuration>
    <!-- Spring Boot properties -->
    <springProperty scope="context" name="appName" source="spring.application.name" defaultValue="app"/>
    <springProperty scope="context" name="appVersion" source="app.version" defaultValue="1.0.0"/>

    <!-- JSON console appender for production -->
    <appender name="JSON_CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="net.logstash.logback.encoder.LogstashEncoder">
            <!-- Custom fields added to every log -->
            <customFields>{"service":"${appName}","version":"${appVersion}"}</customFields>
            <!-- Includes MDC (tracing context) -->
            <includeMdcKeyName>traceId</includeMdcKeyName>
            <includeMdcKeyName>spanId</includeMdcKeyName>
            <includeMdcKeyName>userId</includeMdcKeyName>
            <includeMdcKeyName>requestId</includeMdcKeyName>
            <!-- ISO8601 timestamp format -->
            <timestampPattern>yyyy-MM-dd'T'HH:mm:ss.SSSZ</timestampPattern>
            <!-- Complete stack traces -->
            <throwableConverter class="net.logstash.logback.stacktrace.ShortenedThrowableConverter">
                <maxDepthPerThrowable>30</maxDepthPerThrowable>
                <maxLength>4096</maxLength>
                <shortenedClassNameLength>36</shortenedClassNameLength>
                <rootCauseFirst>true</rootCauseFirst>
            </throwableConverter>
        </encoder>
    </appender>

    <!-- Rolling JSON file appender -->
    <appender name="JSON_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>/var/log/${appName}/application.json</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
            <fileNamePattern>/var/log/${appName}/application.%d{yyyy-MM-dd}.%i.json.gz</fileNamePattern>
            <maxHistory>30</maxHistory>
            <maxFileSize>100MB</maxFileSize>
            <totalSizeCap>3GB</totalSizeCap>
        </rollingPolicy>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder">
            <customFields>{"service":"${appName}","version":"${appVersion}"}</customFields>
        </encoder>
    </appender>

    <!-- Text appender for development -->
    <appender name="TEXT_CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} %highlight(%-5level) [%thread] %cyan(%logger{36}) - %msg%n</pattern>
        </encoder>
    </appender>

    <!-- Activation by Spring profile -->
    <springProfile name="prod,staging">
        <root level="INFO">
            <appender-ref ref="JSON_CONSOLE"/>
            <appender-ref ref="JSON_FILE"/>
        </root>
    </springProfile>

    <springProfile name="dev,local">
        <root level="DEBUG">
            <appender-ref ref="TEXT_CONSOLE"/>
        </root>
    </springProfile>
</configuration>

์ด ์„ค์ •์€ ์šด์˜ ํ™˜๊ฒฝ์—์„œ๋งŒ JSON ๋กœ๊ทธ๋ฅผ ํ™œ์„ฑํ™”ํ•˜๊ณ  ๊ฐœ๋ฐœ ํ™˜๊ฒฝ์—์„œ๋Š” ๊ฐ€๋…์„ฑ ์ข‹์€ ๋กœ๊ทธ๋ฅผ ์œ ์ง€ํ•ฉ๋‹ˆ๋‹ค.

Spring ํ”„๋กœํŒŒ์ผ

<springProfile>์„ ์‚ฌ์šฉํ•˜๋ฉด ์„ค์ •์„ ๋ณ€๊ฒฝํ•˜์ง€ ์•Š๊ณ ๋„ ํ™˜๊ฒฝ์— ๋”ฐ๋ผ ํ…์ŠคํŠธ ํ˜•์‹๊ณผ JSON ํ˜•์‹ ์‚ฌ์ด๋ฅผ ์ž๋™์œผ๋กœ ์ „ํ™˜ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

๋ถ„์‚ฐ ์ถ”์ ์„ ์œ„ํ•œ MDC

์ถ”์  ์ปจํ…์ŠคํŠธ ์ „ํŒŒ

MDC(Mapped Diagnostic Context)๋Š” ์š”์ฒญ์ด๋‚˜ ์ถ”์  ์‹๋ณ„์ž ๊ฐ™์€ ์ปจํ…์ŠคํŠธ ์ •๋ณด๋ฅผ ๋ชจ๋“  ๋กœ๊ทธ์— ์ถ”๊ฐ€ํ•ฉ๋‹ˆ๋‹ค.

java
// Filter for automatic trace context injection
package com.example.logging.filter;

import jakarta.servlet.FilterChain;
import jakarta.servlet.ServletException;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;
import org.slf4j.MDC;
import org.springframework.core.Ordered;
import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Component;
import org.springframework.web.filter.OncePerRequestFilter;

import java.io.IOException;
import java.util.UUID;

@Component
@Order(Ordered.HIGHEST_PRECEDENCE)
public class TracingFilter extends OncePerRequestFilter {

    // Standard MDC keys for tracing
    private static final String TRACE_ID_KEY = "traceId";
    private static final String SPAN_ID_KEY = "spanId";
    private static final String REQUEST_ID_KEY = "requestId";
    private static final String USER_ID_KEY = "userId";

    @Override
    protected void doFilterInternal(
            HttpServletRequest request,
            HttpServletResponse response,
            FilterChain filterChain) throws ServletException, IOException {

        try {
            // Retrieve or generate trace identifiers
            String traceId = extractOrGenerate(request, "X-Trace-Id");
            String spanId = generateSpanId();
            String requestId = extractOrGenerate(request, "X-Request-Id");
            String userId = request.getHeader("X-User-Id");

            // Inject into MDC to appear in all logs
            MDC.put(TRACE_ID_KEY, traceId);
            MDC.put(SPAN_ID_KEY, spanId);
            MDC.put(REQUEST_ID_KEY, requestId);
            if (userId != null) {
                MDC.put(USER_ID_KEY, userId);
            }

            // Propagate to responses for inter-service chaining
            response.setHeader("X-Trace-Id", traceId);
            response.setHeader("X-Request-Id", requestId);

            filterChain.doFilter(request, response);

        } finally {
            // Clean MDC after each request
            MDC.clear();
        }
    }

    private String extractOrGenerate(HttpServletRequest request, String header) {
        String value = request.getHeader(header);
        return value != null ? value : UUID.randomUUID().toString().replace("-", "").substring(0, 16);
    }

    private String generateSpanId() {
        return UUID.randomUUID().toString().replace("-", "").substring(0, 8);
    }
}

All logs emitted while the request is being processed automatically include these identifiers.

Using MDC in Business Code

java
// Business service with enriched contextual logging
package com.example.service;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;
import org.springframework.stereotype.Service;

@Service
public class OrderService {

    private static final Logger log = LoggerFactory.getLogger(OrderService.class);

    public Order createOrder(CreateOrderRequest request) {
        // Add business information to MDC context
        MDC.put("orderId", request.getOrderId());
        MDC.put("customerId", request.getCustomerId());

        try {
            log.info("Creating order with {} items", request.getItems().size());

            // Business logic...
            Order order = processOrder(request);

            log.info("Order created successfully, total: {} {}",
                order.getTotal(), order.getCurrency());

            return order;

        } catch (Exception e) {
            // Exception appears with full MDC context
            log.error("Failed to create order", e);
            throw e;
        } finally {
            // Clean business keys added
            MDC.remove("orderId");
            MDC.remove("customerId");
        }
    }
}

์ƒ์„ฑ๋œ JSON ๋กœ๊ทธ์—๋Š” ๋””๋ฒ„๊น…์— ํ•„์š”ํ•œ ๋ชจ๋“  ์ •๋ณด๊ฐ€ ํฌํ•จ๋ฉ๋‹ˆ๋‹ค.

json
{
  "@timestamp": "2026-03-27T10:15:32.456Z",
  "level": "INFO",
  "logger": "com.example.service.OrderService",
  "message": "Order created successfully, total: 150.00 EUR",
  "traceId": "a1b2c3d4e5f67890",
  "spanId": "12345678",
  "requestId": "req-abc-123",
  "userId": "user-456",
  "orderId": "ORD-789",
  "customerId": "CUST-321"
}
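The put/remove-in-finally pattern shown in OrderService is easy to get wrong. SLF4J also offers MDC.putCloseable for try-with-resources scoping; the underlying mechanism can be sketched with the standard library alone (ScopedContext below is a hypothetical helper written for illustration, not part of SLF4J):

```java
import java.util.HashMap;
import java.util.Map;

// Minimal stdlib sketch of the MDC.putCloseable idea: closing the scope
// removes the context key automatically, even if an exception is thrown.
public class ScopedContext {

    // Close() declared without a checked exception for convenient try-with-resources
    @FunctionalInterface
    public interface Scope extends AutoCloseable {
        @Override
        void close();
    }

    private static final ThreadLocal<Map<String, String>> CTX =
        ThreadLocal.withInitial(HashMap::new);

    public static Scope put(String key, String value) {
        CTX.get().put(key, value);
        return () -> CTX.get().remove(key);
    }

    public static String get(String key) {
        return CTX.get().get(key);
    }

    public static void main(String[] args) {
        try (Scope scope = put("orderId", "ORD-789")) {
            System.out.println(get("orderId")); // ORD-789 while the scope is open
        }
        System.out.println(get("orderId")); // null once the scope is closed
    }
}
```

In real code, `try (MDC.MDCCloseable c = MDC.putCloseable("orderId", id))` achieves the same effect without a custom helper.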


์„ฑ๋Šฅ์„ ์œ„ํ•œ ๋น„๋™๊ธฐ ๋กœ๊น…

์Šค๋ ˆ๋“œ ํ’€ ์„ค์ •

์šด์˜ ํ™˜๊ฒฝ์—์„œ ๋™๊ธฐ์ ์ธ ๋กœ๊ทธ ์“ฐ๊ธฐ๋Š” ์š”์ฒญ ์ง€์—ฐ ์‹œ๊ฐ„์— ์˜ํ–ฅ์„ ์ค๋‹ˆ๋‹ค. ๋น„๋™๊ธฐ ์–ดํŽœ๋”๋Š” ๋ฉ”์ธ ์Šค๋ ˆ๋“œ์™€ ๋กœ๊น…์„ ๋ถ„๋ฆฌํ•ฉ๋‹ˆ๋‹ค.

xml
<!-- logback-spring.xml -->
<!-- High-performance asynchronous appender configuration -->
<appender name="ASYNC_JSON" class="ch.qos.logback.classic.AsyncAppender">
    <!-- Pending log buffer size -->
    <queueSize>1024</queueSize>
    <!-- Never block the calling thread -->
    <neverBlock>true</neverBlock>
    <!-- When remaining capacity drops below this, TRACE/DEBUG/INFO events are discarded -->
    <discardingThreshold>20</discardingThreshold>
    <!-- Include caller information (expensive) -->
    <includeCallerData>false</includeCallerData>
    <!-- Actual appender for writing -->
    <appender-ref ref="JSON_FILE"/>
</appender>

<springProfile name="prod">
    <root level="INFO">
        <appender-ref ref="ASYNC_JSON"/>
    </root>
</springProfile>

Logging System Metrics

Monitoring the logging system itself prevents silent log loss.

java
// Exposing Logback metrics via Micrometer
package com.example.logging.metrics;

import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.Appender;
import ch.qos.logback.classic.AsyncAppender;
import io.micrometer.core.instrument.Gauge;
import io.micrometer.core.instrument.MeterRegistry;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Component;

import jakarta.annotation.PostConstruct;
import java.util.Iterator;

@Component
public class LoggingMetrics {

    private final MeterRegistry registry;

    public LoggingMetrics(MeterRegistry registry) {
        this.registry = registry;
    }

    @PostConstruct
    void registerMetrics() {
        LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();
        Logger rootLogger = context.getLogger(Logger.ROOT_LOGGER_NAME);

        // Iterate through appenders to find AsyncAppenders
        Iterator<Appender<ILoggingEvent>> it = rootLogger.iteratorForAppenders();
        while (it.hasNext()) {
            Appender<ILoggingEvent> appender = it.next();
            if (appender instanceof AsyncAppender asyncAppender) {
                registerAsyncMetrics(asyncAppender);
            }
        }
    }

    private void registerAsyncMetrics(AsyncAppender appender) {
        String appenderName = appender.getName();

        // Events currently waiting in the queue
        // (note: getQueueSize() returns the configured capacity, not the fill level)
        Gauge.builder("logback.async.queue.used", appender, AsyncAppender::getNumberOfElementsInQueue)
            .tag("appender", appenderName)
            .description("Events currently buffered in the async queue")
            .register(registry);

        // Remaining capacity; with neverBlock=true, events are dropped when this hits zero
        Gauge.builder("logback.async.queue.remaining", appender, AsyncAppender::getRemainingCapacity)
            .tag("appender", appenderName)
            .description("Remaining capacity in async queue")
            .register(registry);

        // Configured maximum queue size, useful for computing utilization
        // (AsyncAppender exposes no public counter of discarded events)
        Gauge.builder("logback.async.queue.capacity", appender, AsyncAppender::getQueueSize)
            .tag("appender", appenderName)
            .description("Configured async queue capacity")
            .register(registry);
    }
}

A Prometheus alert on logback.async.queue.remaining dropping below 100 warns about the risk of log loss before it materializes.
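Such a rule might be sketched as follows (Micrometer's Prometheus registry renames dots to underscores; the threshold and durations are illustrative):

```yaml
# prometheus-rules.yml (illustrative thresholds)
groups:
  - name: logging
    rules:
      - alert: AsyncLogQueueNearlyFull
        expr: logback_async_queue_remaining < 100
        for: 2m
        labels:
          severity: warning
        annotations:
          summary: "Async log queue almost full on {{ $labels.appender }}"
          description: "Log events may be dropped (neverBlock=true)."
```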

ELK Stack ์—ฐ๋™

Filebeat ์„ค์ •

Filebeat๋Š” JSON ํŒŒ์ผ์„ ์ˆ˜์ง‘ํ•ด ๋ณ€ํ™˜ ์—†์ด Elasticsearch๋กœ ์ „์†กํ•ฉ๋‹ˆ๋‹ค.

yaml
# filebeat.yml
# Filebeat configuration for Spring Boot JSON logs
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/*/application.json
    # Automatic JSON parsing
    json:
      keys_under_root: true
      overwrite_keys: true
      add_error_key: true
      message_key: message

processors:
  # Add Kubernetes metadata if available
  - add_kubernetes_metadata:
      host: ${NODE_NAME}
      matchers:
        - logs_path:
            logs_path: "/var/log/containers/"
  # Parse timestamp
  - timestamp:
      field: "@timestamp"
      layouts:
        - '2006-01-02T15:04:05.000Z'
        - '2006-01-02T15:04:05.000-07:00'
      test:
        - '2026-03-27T10:15:32.456Z'

output.elasticsearch:
  hosts: ["elasticsearch:9200"]
  index: "logs-%{[service]}-%{+yyyy.MM.dd}"
  pipeline: "spring-boot-logs"

setup.template:
  name: "logs"
  pattern: "logs-*"

๋ฐ์ดํ„ฐ ๋ณด๊ฐ•์„ ์œ„ํ•œ Elasticsearch ํŒŒ์ดํ”„๋ผ์ธ

json
// PUT _ingest/pipeline/spring-boot-logs
{
  "description": "Spring Boot logs enrichment",
  "processors": [
    {
      "geoip": {
        "field": "client.ip",
        "target_field": "client.geo",
        "ignore_missing": true
      }
    },
    {
      "user_agent": {
        "field": "user_agent.original",
        "target_field": "user_agent",
        "ignore_missing": true
      }
    },
    {
      "set": {
        "field": "event.ingested",
        "value": "{{_ingest.timestamp}}"
      }
    },
    {
      "script": {
        "description": "Classify log level severity",
        "source": """
          def level = ctx.level;
          if (level == 'ERROR') ctx.severity = 4;
          else if (level == 'WARN') ctx.severity = 3;
          else if (level == 'INFO') ctx.severity = 2;
          else ctx.severity = 1;
        """
      }
    }
  ]
}
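Before wiring the pipeline into Filebeat, it can be checked with the _simulate API (the sample document is illustrative):

```json
POST _ingest/pipeline/spring-boot-logs/_simulate
{
  "docs": [
    { "_source": { "level": "ERROR", "message": "Payment failed" } }
  ]
}
```

For this document, the response should show the script processor assigning severity 4.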

์šด์˜ ํ™˜๊ฒฝ ๋ชจ๋ฒ” ์‚ฌ๋ก€

์ฒด๊ณ„์ ์œผ๋กœ ํฌํ•จํ•ด์•ผ ํ•  ์ •๋ณด

๊ฐ ๋กœ๊ทธ์—๋Š” ๋””๋ฒ„๊น…๊ณผ ์ƒ๊ด€๊ด€๊ณ„ ๋ถ„์„์„ ์œ„ํ•œ ์ตœ์†Œ ์ •๋ณด๊ฐ€ ํฌํ•จ๋˜์–ด์•ผ ํ•ฉ๋‹ˆ๋‹ค.

java
// Helper for consistent structured logs
package com.example.logging;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

import java.util.Map;
import java.util.function.Supplier;

public final class StructuredLogger {

    private final Logger delegate;

    private StructuredLogger(Class<?> clazz) {
        this.delegate = LoggerFactory.getLogger(clazz);
    }

    public static StructuredLogger getLogger(Class<?> clazz) {
        return new StructuredLogger(clazz);
    }

    // Log with temporary business context
    public void info(String message, Map<String, String> context) {
        try {
            context.forEach(MDC::put);
            delegate.info(message);
        } finally {
            context.keySet().forEach(MDC::remove);
        }
    }

    // Log with supplier for lazy evaluation
    public void debug(Supplier<String> messageSupplier, Map<String, String> context) {
        if (delegate.isDebugEnabled()) {
            try {
                context.forEach(MDC::put);
                delegate.debug(messageSupplier.get());
            } finally {
                context.keySet().forEach(MDC::remove);
            }
        }
    }

    // Error log with full context
    public void error(String message, Throwable t, Map<String, String> context) {
        try {
            context.forEach(MDC::put);
            delegate.error(message, t);
        } finally {
            context.keySet().forEach(MDC::remove);
        }
    }
}
java
// Usage in business code
private static final StructuredLogger log = StructuredLogger.getLogger(PaymentService.class);

public void processPayment(Payment payment) {
    log.info("Processing payment", Map.of(
        "paymentId", payment.getId(),
        "amount", String.valueOf(payment.getAmount()),
        "currency", payment.getCurrency(),
        "method", payment.getMethod().name()
    ));
}

Sensitive Data to Exclude

Logs must never contain personal or sensitive data.

java
// Sensitive data masking filter
package com.example.logging.filter;

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.filter.Filter;
import ch.qos.logback.core.spi.FilterReply;

import java.util.regex.Pattern;

public class SensitiveDataFilter extends Filter<ILoggingEvent> {

    // Sensitive data patterns to mask
    private static final Pattern EMAIL_PATTERN =
        Pattern.compile("[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\\.[a-zA-Z]{2,}");
    private static final Pattern CREDIT_CARD_PATTERN =
        Pattern.compile("\\b\\d{4}[- ]?\\d{4}[- ]?\\d{4}[- ]?\\d{4}\\b");
    private static final Pattern PASSWORD_PATTERN =
        Pattern.compile("(?i)(password|pwd|secret|token)[\"']?\\s*[:=]\\s*[\"']?[^\\s,}\"']+");
    private static final Pattern PHONE_PATTERN =
        Pattern.compile("\\+?\\d{1,3}[- ]?\\d{6,14}");

    @Override
    public FilterReply decide(ILoggingEvent event) {
        // Accept all logs but modify the message
        // Note: for real masking, use a custom converter
        return FilterReply.NEUTRAL;
    }

    // Utility method to mask data
    public static String maskSensitiveData(String input) {
        if (input == null) return null;

        String result = input;
        result = EMAIL_PATTERN.matcher(result).replaceAll("[EMAIL_MASKED]");
        result = CREDIT_CARD_PATTERN.matcher(result).replaceAll("[CARD_MASKED]");
        result = PASSWORD_PATTERN.matcher(result).replaceAll("$1=[REDACTED]");
        result = PHONE_PATTERN.matcher(result).replaceAll("[PHONE_MASKED]");

        return result;
    }
}
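A quick usage sketch of the masking approach (a standalone rendition of two of the patterns above, so it can run on its own):

```java
import java.util.regex.Pattern;

// Standalone sketch of the masking utility (email + password patterns only)
public class SensitiveMaskDemo {

    private static final Pattern EMAIL =
        Pattern.compile("[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\\.[a-zA-Z]{2,}");
    private static final Pattern PASSWORD =
        Pattern.compile("(?i)(password|pwd|secret|token)[\"']?\\s*[:=]\\s*[\"']?[^\\s,}\"']+");

    public static String mask(String input) {
        if (input == null) return null;
        // Mask addresses first, then credential-style key/value pairs
        String result = EMAIL.matcher(input).replaceAll("[EMAIL_MASKED]");
        return PASSWORD.matcher(result).replaceAll("$1=[REDACTED]");
    }

    public static void main(String[] args) {
        System.out.println(mask("login john@example.com password=hunter2"));
        // -> login [EMAIL_MASKED] password=[REDACTED]
    }
}
```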
๊ฐœ์ธ์ •๋ณด ๋ณดํ˜ธ ๋ฐ ์ปดํ”Œ๋ผ์ด์–ธ์Šค

๊ฐœ์ธ ์ •๋ณด๋ฅผ ํฌํ•จํ•œ ๋กœ๊ทธ๋Š” GDPR์ด๋‚˜ ํ•œ๊ตญ ๊ฐœ์ธ์ •๋ณด๋ณดํ˜ธ๋ฒ•์˜ ์ ์šฉ์„ ๋ฐ›์Šต๋‹ˆ๋‹ค. IP ์ฃผ์†Œ, ์ด๋ฉ”์ผ, ์‚ฌ์šฉ์ž ์‹๋ณ„์ž์—๋Š” ๋ณด์กด ์ •์ฑ…๊ณผ ํ•„์š”ํ•œ ๊ฒฝ์šฐ ๋™์˜ ์ ˆ์ฐจ๊ฐ€ ์žˆ์–ด์•ผ ํ•ฉ๋‹ˆ๋‹ค.

์ ์ ˆํ•œ ๋กœ๊ทธ ๋ ˆ๋ฒจ

java
// Appropriate log level guidelines
package com.example.logging;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LogLevelGuidelines {

    private static final Logger log = LoggerFactory.getLogger(LogLevelGuidelines.class);

    // Illustrates which level fits which situation
    void illustrate(Exception exception, String orderId, String customerId,
                    String cacheKey, int index, int total) {

        // ERROR: failure requiring intervention
        // - Unrecoverable exceptions
        // - Critical transaction failures
        // - External service unavailability
        log.error("Payment gateway unreachable after 3 retries", exception);

        // WARN: abnormal but handled situation
        // - Retry in progress
        // - Performance degradation
        // - Resources near limits
        log.warn("Database connection pool at 85% capacity");

        // INFO: significant business events
        // - Transaction start/end
        // - Important state changes
        // - Key user actions
        log.info("Order {} shipped to customer {}", orderId, customerId);

        // DEBUG: diagnostic information
        // - Execution details
        // - Important variable values
        // - Branching decisions
        log.debug("Cache miss for key {}, fetching from database", cacheKey);

        // TRACE: very fine-grained details
        // - Method entry/exit
        // - Complete object contents
        // - Loops and iterations
        log.trace("Processing item {} of {}", index, total);
    }
}

Testing and Validating Logs

Unit Tests for the JSON Structure

java
// Structured log validation tests
package com.example.logging;

import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.read.ListAppender;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

import static org.assertj.core.api.Assertions.assertThat;

class StructuredLoggingTest {

    private ListAppender<ILoggingEvent> listAppender;
    private Logger logger;
    private ObjectMapper objectMapper;

    @BeforeEach
    void setUp() {
        logger = (Logger) LoggerFactory.getLogger(StructuredLoggingTest.class);
        listAppender = new ListAppender<>();
        listAppender.start();
        logger.addAppender(listAppender);
        objectMapper = new ObjectMapper();
    }

    @Test
    void shouldIncludeMdcFieldsInLog() {
        // Given
        MDC.put("traceId", "test-trace-123");
        MDC.put("userId", "user-456");

        // When
        logger.info("Test message with MDC context");

        // Then
        ILoggingEvent event = listAppender.list.get(0);
        assertThat(event.getMDCPropertyMap())
            .containsEntry("traceId", "test-trace-123")
            .containsEntry("userId", "user-456");

        MDC.clear();
    }

    @Test
    void shouldLogExceptionWithStackTrace() {
        // Given
        Exception testException = new RuntimeException("Test error");

        // When
        logger.error("Operation failed", testException);

        // Then
        ILoggingEvent event = listAppender.list.get(0);
        assertThat(event.getThrowableProxy()).isNotNull();
        assertThat(event.getThrowableProxy().getMessage()).isEqualTo("Test error");
    }
}

Conclusion

Structured JSON logs transform the observability of Spring Boot applications.

✅ Queryable: every field can be filtered in Elasticsearch or CloudWatch

✅ Correlatable: MDC propagates trace identifiers across services

✅ High-performance: asynchronous appenders decouple logging from request processing

✅ Safe: sensitive data masking supports GDPR compliance

✅ Integrated: native compatibility with the ELK Stack, Datadog, and Splunk

✅ Alertable: structured fields enable precise alerting rules

✅ Maintainable: the JSON format eliminates brittle parsing regexes

Together with metrics (Micrometer) and distributed tracing (OpenTelemetry), this approach forms the foundation of modern observability.

์—ฐ์Šต์„ ์‹œ์ž‘ํ•˜์„ธ์š”!

๋ฉด์ ‘ ์‹œ๋ฎฌ๋ ˆ์ดํ„ฐ์™€ ๊ธฐ์ˆ  ํ…Œ์ŠคํŠธ๋กœ ์ง€์‹์„ ํ…Œ์ŠคํŠธํ•˜์„ธ์š”.

ํƒœ๊ทธ

#spring boot logging
#logback json
#structured logs
#elk stack
#observability

๊ณต์œ 

๊ด€๋ จ ๊ธฐ์‚ฌ

Micrometer์™€ Prometheus๋กœ ๊ตฌํ˜„ํ•˜๋Š” Spring Boot Actuator ๋ชจ๋‹ˆํ„ฐ๋ง

Spring Boot Actuator: Micrometer์™€ Prometheus๋กœ ๊ตฌํ˜„ํ•˜๋Š” ์šด์˜ ๋ชจ๋‹ˆํ„ฐ๋ง

์šด์˜ ๋ชจ๋‹ˆํ„ฐ๋ง์„ ์œ„ํ•œ Spring Boot Actuator ์™„์ „ ๊ฐ€์ด๋“œ์ž…๋‹ˆ๋‹ค. Micrometer ์„ค์ •, Prometheus ์ง€ํ‘œ, ์ปค์Šคํ…€ ์—”๋“œํฌ์ธํŠธ, ์•Œ๋ฆผ ๊ตฌ์„ฑ์„ ๋‹ค๋ฃน๋‹ˆ๋‹ค.

Spring Kafka์™€ ํšŒ๋ณตํƒ„๋ ฅ์„ฑ์„ ๊ฐ–์ถ˜ ์ปจ์Šˆ๋จธ๋กœ ๊ตฌ์ถ•ํ•œ ์ด๋ฒคํŠธ ๊ธฐ๋ฐ˜ ์•„ํ‚คํ…์ฒ˜

Spring Kafka: ํšŒ๋ณตํƒ„๋ ฅ์„ฑ์„ ๊ฐ–์ถ˜ ์ปจ์Šˆ๋จธ๋กœ ๊ตฌ์ถ•ํ•˜๋Š” ์ด๋ฒคํŠธ ๊ธฐ๋ฐ˜ ์•„ํ‚คํ…์ฒ˜

์ด๋ฒคํŠธ ๊ธฐ๋ฐ˜ ์•„ํ‚คํ…์ฒ˜๋ฅผ ์œ„ํ•œ ์™„์ „ํ•œ Spring Kafka ๊ฐ€์ด๋“œ. ์„ค์ •, ํšŒ๋ณตํƒ„๋ ฅ์„ฑ์„ ๊ฐ–์ถ˜ ์ปจ์Šˆ๋จธ, ์žฌ์‹œ๋„ ์ •์ฑ…, Dead Letter Queue, ๋ถ„์‚ฐ ์• ํ”Œ๋ฆฌ์ผ€์ด์…˜์„ ์œ„ํ•œ ์šด์˜ ํŒจํ„ด.

Resolver ๋ฐ DataLoader๊ฐ€ ์žˆ๋Š” Spring GraphQL ๊ธฐ์ˆ  ๋ฉด์ ‘

Spring GraphQL ๋ฉด์ ‘: Resolver, DataLoader ๋ฐ N+1 ๋ฌธ์ œ ํ•ด๊ฒฐ์ฑ…

์ด ์™„์ „ํ•œ ๊ฐ€์ด๋“œ๋กœ Spring GraphQL ๋ฉด์ ‘์„ ์ค€๋น„ํ•ฉ๋‹ˆ๋‹ค. Resolver, DataLoader, N+1 ๋ฌธ์ œ ์ฒ˜๋ฆฌ, mutation ๋ฐ ๊ธฐ์ˆ  ์งˆ๋ฌธ์„ ์œ„ํ•œ ๋ชจ๋ฒ” ์‚ฌ๋ก€๋ฅผ ๋‹ค๋ฃน๋‹ˆ๋‹ค.