Spring Boot Logging in 2026: Structured Logs for Production with Logback and JSON
The complete guide to structured logging in Spring Boot: Logback JSON configuration, MDC for tracing, production best practices, and ELK Stack integration.

Traditional text logs quickly become unmanageable in production. When hundreds of instances emit thousands of log lines per second, finding a specific error turns into a nightmare. Structured logs in JSON format change this completely by making every event queryable and automatically analyzable.
Spring Boot 3.4 and later support structured JSON logging natively, with no external dependency. On earlier versions, the Logstash Logback Encoder remains the standard solution.
Why Adopt Structured Logs
The Limits of Traditional Text Logs
A typical text log looks like this:
2026-03-27 10:15:32.456 INFO [order-service,abc123] c.e.s.OrderService - Order created for user john@example.com, amount: 150.00€, items: 3
This format causes several problems in production. Extracting a specific piece of information requires complex, brittle regular expressions. Correlating events across services depends on strict conventions that every team interprets differently. Analysis tools such as Elasticsearch struggle to index these unstructured strings efficiently.
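To see how brittle that gets, here is a rough, hypothetical sketch of the kind of regular expression needed just to pull three fields out of that line; any change to the message wording silently breaks it.
// Hypothetical parser for the text log line above
package com.example.logging.legacy;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
public class TextLogParser {
    // Tied to the exact wording and punctuation of the log message
    private static final Pattern ORDER_LINE = Pattern.compile(
        "Order created for user (\\S+), amount: ([0-9.]+)\\S*, items: (\\d+)");
    public static void main(String[] args) {
        String line = "2026-03-27 10:15:32.456 INFO [order-service,abc123] "
            + "c.e.s.OrderService - Order created for user john@example.com, amount: 150.00, items: 3";
        Matcher m = ORDER_LINE.matcher(line);
        if (m.find()) {
            System.out.printf("user=%s amount=%s items=%s%n", m.group(1), m.group(2), m.group(3));
        }
    }
}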
The Advantages of the JSON Format
The same event expressed as JSON becomes immediately exploitable:
{
"@timestamp": "2026-03-27T10:15:32.456Z",
"level": "INFO",
"logger": "com.example.service.OrderService",
"message": "Order created",
"service": "order-service",
"traceId": "abc123",
"userId": "john@example.com",
"orderId": "ORD-789456",
"amount": 150.00,
"currency": "EUR",
"itemCount": 3
}
Every field becomes filterable and aggregatable. An Elasticsearch query can instantly return all orders above 100 euros in the last 15 minutes. Kibana dashboards visualize trends without any manual parsing.
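For illustration, such a query might look roughly like this in the Elasticsearch query DSL (the index pattern and field names follow the example above and are assumptions):
// GET logs-order-service-*/_search
{
  "query": {
    "bool": {
      "filter": [
        { "range": { "@timestamp": { "gte": "now-15m" } } },
        { "range": { "amount": { "gt": 100 } } },
        { "term": { "currency": "EUR" } }
      ]
    }
  }
}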
Native Configuration in Spring Boot 3.4+
Enabling Structured JSON Logs
Spring Boot 3.4 introduced native support for structured logging through the logging.structured properties. This approach requires no additional dependency at all.
# application.yml
# Native structured logging configuration for Spring Boot 3.4+
logging:
  structured:
    # Output format: ecs (Elastic), logstash, gelf
    format:
      console: ecs
      file: ecs
  file:
    name: /var/log/app/application.log
  level:
    root: INFO
    com.example: DEBUG
The ECS (Elastic Common Schema) format guarantees out-of-the-box compatibility with Elasticsearch and Kibana, with no additional configuration.
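With the ecs format, each log line is emitted as a single JSON document shaped roughly as follows (shown pretty-printed here; the exact field set and ecs.version depend on the Spring Boot release):
{
  "@timestamp": "2026-03-27T10:15:32.456Z",
  "log.level": "INFO",
  "process.pid": 4321,
  "process.thread.name": "http-nio-8080-exec-1",
  "service.name": "order-service",
  "log.logger": "com.example.service.OrderService",
  "message": "Order created",
  "ecs.version": "8.11"
}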
Customizing JSON Fields
To add business fields to every log line, Spring Boot exposes additional properties as well as a programmatic customizer.
# application.yml
# Custom fields in structured logs
logging:
  structured:
    format:
      console: ecs
    ecs:
      # Service information added to every log
      service:
        name: ${spring.application.name}
        version: ${app.version:1.0.0}
        environment: ${spring.profiles.active:default}
        node-name: ${HOSTNAME:unknown}
// Programmatic configuration for additional fields
package com.example.logging.config;
import org.springframework.boot.json.JsonWriter;
import org.springframework.boot.logging.structured.StructuredLoggingJsonMembersCustomizer;
// Adds static members to every structured log line
public class StaticFieldsCustomizer implements StructuredLoggingJsonMembersCustomizer<Object> {
    @Override
    public void customize(JsonWriter.Members<Object> members) {
        // Static fields added to all logs
        members.add("team", "backend");
        members.add("region", System.getenv("AWS_REGION"));
    }
}
These fields appear on every log line, which makes it easy to filter by team or region in dashboards.
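Assuming the logging.structured.json.customizer property that ships with Spring Boot's structured-logging support, the customizer above can be registered like this (the class name is the hypothetical one from the sketch):
# application.yml
logging:
  structured:
    json:
      customizer: com.example.logging.config.StaticFieldsCustomizer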
Classic Logback Configuration with a JSON Encoder
The Logstash Encoder Dependency
For versions earlier than Spring Boot 3.4, or when advanced customization is required, the Logstash Logback Encoder remains the standard solution.
<!-- pom.xml -->
<!-- Dependency for JSON logging with Logback -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>7.4</version>
</dependency>
Complete Logback Configuration
The logback-spring.xml file gives complete control over the output format.
<?xml version="1.0" encoding="UTF-8"?>
<!-- src/main/resources/logback-spring.xml -->
<!-- Logback configuration for structured JSON logs -->
<configuration>
<!-- Spring Boot properties -->
<springProperty scope="context" name="appName" source="spring.application.name" defaultValue="app"/>
<springProperty scope="context" name="appVersion" source="app.version" defaultValue="1.0.0"/>
<!-- JSON console appender for production -->
<appender name="JSON_CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
<encoder class="net.logstash.logback.encoder.LogstashEncoder">
<!-- Custom fields added to every log -->
<customFields>{"service":"${appName}","version":"${appVersion}"}</customFields>
<!-- Includes MDC (tracing context) -->
<includeMdcKeyName>traceId</includeMdcKeyName>
<includeMdcKeyName>spanId</includeMdcKeyName>
<includeMdcKeyName>userId</includeMdcKeyName>
<includeMdcKeyName>requestId</includeMdcKeyName>
<!-- ISO8601 timestamp format -->
<timestampPattern>yyyy-MM-dd'T'HH:mm:ss.SSSZ</timestampPattern>
<!-- Complete stack traces -->
<throwableConverter class="net.logstash.logback.stacktrace.ShortenedThrowableConverter">
<maxDepthPerThrowable>30</maxDepthPerThrowable>
<maxLength>4096</maxLength>
<shortenedClassNameLength>36</shortenedClassNameLength>
<rootCauseFirst>true</rootCauseFirst>
</throwableConverter>
</encoder>
</appender>
<!-- Rolling JSON file appender -->
<appender name="JSON_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>/var/log/${appName}/application.json</file>
<rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
<fileNamePattern>/var/log/${appName}/application.%d{yyyy-MM-dd}.%i.json.gz</fileNamePattern>
<maxHistory>30</maxHistory>
<maxFileSize>100MB</maxFileSize>
<totalSizeCap>3GB</totalSizeCap>
</rollingPolicy>
<encoder class="net.logstash.logback.encoder.LogstashEncoder">
<customFields>{"service":"${appName}","version":"${appVersion}"}</customFields>
</encoder>
</appender>
<!-- Text appender for development -->
<appender name="TEXT_CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d{HH:mm:ss.SSS} %highlight(%-5level) [%thread] %cyan(%logger{36}) - %msg%n</pattern>
</encoder>
</appender>
<!-- Activation by Spring profile -->
<springProfile name="prod,staging">
<root level="INFO">
<appender-ref ref="JSON_CONSOLE"/>
<appender-ref ref="JSON_FILE"/>
</root>
</springProfile>
<springProfile name="dev,local">
<root level="DEBUG">
<appender-ref ref="TEXT_CONSOLE"/>
</root>
</springProfile>
</configuration>
This configuration enables JSON logs only in production environments while keeping human-readable logs in development.
With <springProfile>, the output switches automatically between text and JSON depending on the environment, without any configuration change.
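The active profile is selected at deployment time, for example:
# application.yml (or SPRING_PROFILES_ACTIVE=prod in the runtime environment)
spring:
  profiles:
    active: prod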
MDC for Distributed Tracing
Propagating the Tracing Context
MDC (Mapped Diagnostic Context) attaches contextual information, such as request and trace identifiers, to every log.
// Filter for automatic trace context injection
package com.example.logging.filter;
import jakarta.servlet.FilterChain;
import jakarta.servlet.ServletException;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;
import org.slf4j.MDC;
import org.springframework.core.Ordered;
import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Component;
import org.springframework.web.filter.OncePerRequestFilter;
import java.io.IOException;
import java.util.UUID;
@Component
@Order(Ordered.HIGHEST_PRECEDENCE)
public class TracingFilter extends OncePerRequestFilter {
// Standard MDC keys for tracing
private static final String TRACE_ID_KEY = "traceId";
private static final String SPAN_ID_KEY = "spanId";
private static final String REQUEST_ID_KEY = "requestId";
private static final String USER_ID_KEY = "userId";
@Override
protected void doFilterInternal(
HttpServletRequest request,
HttpServletResponse response,
FilterChain filterChain) throws ServletException, IOException {
try {
// Retrieve or generate trace identifiers
String traceId = extractOrGenerate(request, "X-Trace-Id", TRACE_ID_KEY);
String spanId = generateSpanId();
String requestId = extractOrGenerate(request, "X-Request-Id", REQUEST_ID_KEY);
String userId = request.getHeader("X-User-Id");
// Inject into MDC to appear in all logs
MDC.put(TRACE_ID_KEY, traceId);
MDC.put(SPAN_ID_KEY, spanId);
MDC.put(REQUEST_ID_KEY, requestId);
if (userId != null) {
MDC.put(USER_ID_KEY, userId);
}
// Propagate to responses for inter-service chaining
response.setHeader("X-Trace-Id", traceId);
response.setHeader("X-Request-Id", requestId);
filterChain.doFilter(request, response);
} finally {
// Clean MDC after each request
MDC.clear();
}
}
private String extractOrGenerate(HttpServletRequest request, String header, String key) {
String value = request.getHeader(header);
return value != null ? value : UUID.randomUUID().toString().replace("-", "").substring(0, 16);
}
private String generateSpanId() {
return UUID.randomUUID().toString().replace("-", "").substring(0, 8);
}
}
Every log produced while the request is being processed automatically includes these identifiers.
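One caveat: MDC is backed by a ThreadLocal, so the context does not follow tasks handed off to other threads. Below is a minimal sketch of propagating it to @Async executions with a TaskDecorator (the AsyncMdcConfig class and pool size are illustrative assumptions).
// Copies the caller's MDC onto worker threads
package com.example.logging.config;
import java.util.Map;
import java.util.concurrent.Executor;
import org.slf4j.MDC;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.TaskDecorator;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;
@Configuration
@EnableAsync
public class AsyncMdcConfig {
    @Bean
    public Executor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(8);
        // Decorate every submitted task with the submitter's MDC snapshot
        executor.setTaskDecorator(runnable -> {
            Map<String, String> context = MDC.getCopyOfContextMap();
            return () -> {
                if (context != null) {
                    MDC.setContextMap(context);
                }
                try {
                    runnable.run();
                } finally {
                    MDC.clear();
                }
            };
        });
        executor.initialize();
        return executor;
    }
}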
Using MDC in Business Code
// Business service with enriched contextual logging
package com.example.service;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;
import org.springframework.stereotype.Service;
@Service
public class OrderService {
private static final Logger log = LoggerFactory.getLogger(OrderService.class);
public Order createOrder(CreateOrderRequest request) {
// Add business information to MDC context
MDC.put("orderId", request.getOrderId());
MDC.put("customerId", request.getCustomerId());
try {
log.info("Creating order with {} items", request.getItems().size());
// Business logic...
Order order = processOrder(request);
log.info("Order created successfully, total: {} {}",
order.getTotal(), order.getCurrency());
return order;
} catch (Exception e) {
// Exception appears with full MDC context
log.error("Failed to create order", e);
throw e;
} finally {
// Clean business keys added
MDC.remove("orderId");
MDC.remove("customerId");
}
}
}
The resulting JSON log contains everything needed for debugging:
{
"@timestamp": "2026-03-27T10:15:32.456Z",
"level": "INFO",
"logger": "com.example.service.OrderService",
"message": "Order created successfully, total: 150.00 EUR",
"traceId": "a1b2c3d4e5f67890",
"spanId": "12345678",
"requestId": "req-abc-123",
"userId": "user-456",
"orderId": "ORD-789",
"customerId": "CUST-321"
}
Asynchronous Logging for Performance
Async Appender Configuration
Synchronous log writes add latency to requests in production. An asynchronous appender decouples logging from the main thread.
<!-- logback-spring.xml -->
<!-- High-performance asynchronous appender configuration -->
<appender name="ASYNC_JSON" class="ch.qos.logback.classic.AsyncAppender">
<!-- Pending log buffer size -->
<queueSize>1024</queueSize>
<!-- Never block the calling thread -->
<neverBlock>true</neverBlock>
<!-- Threshold before dropping DEBUG/TRACE logs -->
<discardingThreshold>20</discardingThreshold>
<!-- Include caller information (expensive) -->
<includeCallerData>false</includeCallerData>
<!-- Actual appender for writing -->
<appender-ref ref="JSON_FILE"/>
</appender>
<springProfile name="prod">
<root level="INFO">
<appender-ref ref="ASYNC_JSON"/>
</root>
</springProfile>
Metrics for the Logging System
Monitoring the logging system itself prevents logs from being lost silently.
// Exposing Logback metrics via Micrometer
package com.example.logging.metrics;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.Appender;
import ch.qos.logback.classic.AsyncAppender;
import io.micrometer.core.instrument.Gauge;
import io.micrometer.core.instrument.MeterRegistry;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Component;
import jakarta.annotation.PostConstruct;
import java.util.Iterator;
@Component
public class LoggingMetrics {
private final MeterRegistry registry;
public LoggingMetrics(MeterRegistry registry) {
this.registry = registry;
}
@PostConstruct
void registerMetrics() {
LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();
Logger rootLogger = context.getLogger(Logger.ROOT_LOGGER_NAME);
// Iterate through appenders to find AsyncAppenders
Iterator<Appender<ILoggingEvent>> it = rootLogger.iteratorForAppenders();
while (it.hasNext()) {
Appender<ILoggingEvent> appender = it.next();
if (appender instanceof AsyncAppender asyncAppender) {
registerAsyncMetrics(asyncAppender);
}
}
}
private void registerAsyncMetrics(AsyncAppender appender) {
String appenderName = appender.getName();
// Configured queue capacity
Gauge.builder("logback.async.queue.size", appender, AsyncAppender::getQueueSize)
.tag("appender", appenderName)
.description("Configured async appender queue capacity")
.register(registry);
// Remaining capacity
Gauge.builder("logback.async.queue.remaining", appender, AsyncAppender::getRemainingCapacity)
.tag("appender", appenderName)
.description("Remaining capacity in async queue")
.register(registry);
// Events currently waiting in the queue
Gauge.builder("logback.async.queue.used", appender, AsyncAppender::getNumberOfElementsInQueue)
.tag("appender", appenderName)
.description("Number of log events currently in the async queue")
.register(registry);
}
}
A Prometheus alert on logback.async.queue.remaining < 100 gives early warning before logs start being dropped.
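On the Prometheus side, an alerting rule for that condition could look like the following (the metric name assumes Micrometer's default Prometheus naming, which turns dots into underscores; the threshold and duration are illustrative):
# prometheus-rules.yml
groups:
  - name: logging
    rules:
      - alert: AsyncLogQueueAlmostFull
        expr: logback_async_queue_remaining < 100
        for: 2m
        labels:
          severity: warning
        annotations:
          summary: "Async log queue almost full on {{ $labels.instance }}"
          description: "Remaining capacity below 100 events; DEBUG/TRACE logs may start being discarded."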
ELK Stack Integration
Filebeat Configuration
Filebeat collects the JSON files and ships them to Elasticsearch without any transformation.
# filebeat.yml
# Filebeat configuration for Spring Boot JSON logs
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/*/application.json
    # Automatic JSON parsing
    json:
      keys_under_root: true
      overwrite_keys: true
      add_error_key: true
      message_key: message

processors:
  # Add Kubernetes metadata if available
  - add_kubernetes_metadata:
      host: ${NODE_NAME}
      matchers:
        - logs_path:
            logs_path: "/var/log/containers/"
  # Parse timestamp
  - timestamp:
      field: "@timestamp"
      layouts:
        - '2006-01-02T15:04:05.000Z'
        - '2006-01-02T15:04:05.000-07:00'
      test:
        - '2026-03-27T10:15:32.456Z'

output.elasticsearch:
  hosts: ["elasticsearch:9200"]
  index: "logs-%{[service]}-%{+yyyy.MM.dd}"
  pipeline: "spring-boot-logs"

setup.template:
  name: "logs"
  pattern: "logs-*"
Elasticsearch Pipeline for Data Enrichment
// PUT _ingest/pipeline/spring-boot-logs
{
"description": "Spring Boot logs enrichment",
"processors": [
{
"geoip": {
"field": "client.ip",
"target_field": "client.geo",
"ignore_missing": true
}
},
{
"user_agent": {
"field": "user_agent.original",
"target_field": "user_agent",
"ignore_missing": true
}
},
{
"set": {
"field": "event.ingested",
"value": "{{_ingest.timestamp}}"
}
},
{
"script": {
"description": "Classify log level severity",
"source": """
def level = ctx.level;
if (level == 'ERROR') ctx.severity = 4;
else if (level == 'WARN') ctx.severity = 3;
else if (level == 'INFO') ctx.severity = 2;
else ctx.severity = 1;
"""
}
}
]
}
Production Best Practices
Information to Include Systematically
Every log should carry the minimum information needed for debugging and correlation.
// Helper for consistent structured logs
package com.example.logging;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;
import java.util.Map;
import java.util.function.Supplier;
public final class StructuredLogger {
private final Logger delegate;
private StructuredLogger(Class<?> clazz) {
this.delegate = LoggerFactory.getLogger(clazz);
}
public static StructuredLogger getLogger(Class<?> clazz) {
return new StructuredLogger(clazz);
}
// Log with temporary business context
public void info(String message, Map<String, String> context) {
try {
context.forEach(MDC::put);
delegate.info(message);
} finally {
context.keySet().forEach(MDC::remove);
}
}
// Log with supplier for lazy evaluation
public void debug(Supplier<String> messageSupplier, Map<String, String> context) {
if (delegate.isDebugEnabled()) {
try {
context.forEach(MDC::put);
delegate.debug(messageSupplier.get());
} finally {
context.keySet().forEach(MDC::remove);
}
}
}
// Error log with full context
public void error(String message, Throwable t, Map<String, String> context) {
try {
context.forEach(MDC::put);
delegate.error(message, t);
} finally {
context.keySet().forEach(MDC::remove);
}
}
}
// Usage in business code
private static final StructuredLogger log = StructuredLogger.getLogger(PaymentService.class);
public void processPayment(Payment payment) {
log.info("Processing payment", Map.of(
"paymentId", payment.getId(),
"amount", String.valueOf(payment.getAmount()),
"currency", payment.getCurrency(),
"method", payment.getMethod().name()
));
}
Sensitive Data to Exclude
Logs must never contain personal or otherwise sensitive data.
// Sensitive data masking filter
package com.example.logging.filter;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.filter.Filter;
import ch.qos.logback.core.spi.FilterReply;
import java.util.regex.Pattern;
public class SensitiveDataFilter extends Filter<ILoggingEvent> {
// Sensitive data patterns to mask
private static final Pattern EMAIL_PATTERN =
Pattern.compile("[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\\.[a-zA-Z]{2,}");
private static final Pattern CREDIT_CARD_PATTERN =
Pattern.compile("\\b\\d{4}[- ]?\\d{4}[- ]?\\d{4}[- ]?\\d{4}\\b");
private static final Pattern PASSWORD_PATTERN =
Pattern.compile("(?i)(password|pwd|secret|token)[\"']?\\s*[:=]\\s*[\"']?[^\\s,}\"']+");
private static final Pattern PHONE_PATTERN =
Pattern.compile("\\+?\\d{1,3}[- ]?\\d{6,14}");
@Override
public FilterReply decide(ILoggingEvent event) {
// Accept all logs but modify the message
// Note: for real masking, use a custom converter
return FilterReply.NEUTRAL;
}
// Utility method to mask data
public static String maskSensitiveData(String input) {
if (input == null) return null;
String result = input;
result = EMAIL_PATTERN.matcher(result).replaceAll("[EMAIL_MASKED]");
result = CREDIT_CARD_PATTERN.matcher(result).replaceAll("[CARD_MASKED]");
result = PASSWORD_PATTERN.matcher(result).replaceAll("$1=[REDACTED]");
result = PHONE_PATTERN.matcher(result).replaceAll("[PHONE_MASKED]");
return result;
}
}
Logs that contain personal data fall under the GDPR and Korea's Personal Information Protection Act. IP addresses, email addresses, and user identifiers require a retention policy and, where applicable, a consent procedure.
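A quick unit test (hypothetical, reusing the maskSensitiveData helper above) pins down the masking rules and guards against regressions:
// Unit tests for the masking helper
package com.example.logging.filter;
import org.junit.jupiter.api.Test;
import static org.assertj.core.api.Assertions.assertThat;
class SensitiveDataFilterTest {
    @Test
    void shouldMaskEmailAndCardNumber() {
        String input = "Payment by john@example.com with card 4111 1111 1111 1111";
        String masked = SensitiveDataFilter.maskSensitiveData(input);
        assertThat(masked)
            .doesNotContain("john@example.com")
            .doesNotContain("4111")
            .contains("[EMAIL_MASKED]")
            .contains("[CARD_MASKED]");
    }
    @Test
    void shouldRedactPasswords() {
        String masked = SensitiveDataFilter.maskSensitiveData("login ok, password=hunter2");
        assertThat(masked).contains("[REDACTED]").doesNotContain("hunter2");
    }
}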
Appropriate Log Levels
// Appropriate log level guidelines
package com.example.logging;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class LogLevelGuidelines {
    private static final Logger log = LoggerFactory.getLogger(LogLevelGuidelines.class);
    void illustrate(Exception exception, String orderId, String customerId, String cacheKey, int index, int total) {
        // ERROR: Failure requiring intervention
        // - Unrecoverable exceptions
        // - Critical transaction failures
        // - External service unavailability
        log.error("Payment gateway unreachable after 3 retries", exception);
        // WARN: Abnormal but handled situation
        // - Retry in progress
        // - Performance degradation
        // - Resources near limits
        log.warn("Database connection pool at 85% capacity");
        // INFO: Significant business events
        // - Transaction start/end
        // - Important state changes
        // - Key user actions
        log.info("Order {} shipped to customer {}", orderId, customerId);
        // DEBUG: Diagnostic information
        // - Execution details
        // - Important variable values
        // - Branching decisions
        log.debug("Cache miss for key {}, fetching from database", cacheKey);
        // TRACE: Very fine details
        // - Method entry/exit
        // - Complete object contents
        // - Loops and iterations
        log.trace("Processing item {} of {}", index, total);
    }
}
Testing and Validating Logs
Unit Tests for the JSON Structure
// Structured log validation tests
package com.example.logging;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.read.ListAppender;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;
import static org.assertj.core.api.Assertions.assertThat;
class StructuredLoggingTest {
private ListAppender<ILoggingEvent> listAppender;
private Logger logger;
private ObjectMapper objectMapper;
@BeforeEach
void setUp() {
logger = (Logger) LoggerFactory.getLogger(StructuredLoggingTest.class);
listAppender = new ListAppender<>();
listAppender.start();
logger.addAppender(listAppender);
objectMapper = new ObjectMapper();
}
@Test
void shouldIncludeMdcFieldsInLog() {
// Given
MDC.put("traceId", "test-trace-123");
MDC.put("userId", "user-456");
// When
logger.info("Test message with MDC context");
// Then
ILoggingEvent event = listAppender.list.get(0);
assertThat(event.getMDCPropertyMap())
.containsEntry("traceId", "test-trace-123")
.containsEntry("userId", "user-456");
MDC.clear();
}
@Test
void shouldLogExceptionWithStackTrace() {
// Given
Exception testException = new RuntimeException("Test error");
// When
logger.error("Operation failed", testException);
// Then
ILoggingEvent event = listAppender.list.get(0);
assertThat(event.getThrowableProxy()).isNotNull();
assertThat(event.getThrowableProxy().getMessage()).isEqualTo("Test error");
}
}
Conclusion
Structured JSON logs transform the observability of Spring Boot applications:
✅ Queryable: every field can be filtered in Elasticsearch or CloudWatch
✅ Correlatable: MDC propagates trace identifiers across services
✅ High-performance: asynchronous appenders decouple logging from request processing
✅ Safe: sensitive-data masking supports GDPR compliance
✅ Integrated: native compatibility with the ELK Stack, Datadog, and Splunk
✅ Alertable: structured fields enable precise alerting rules
✅ Maintainable: the JSON format eliminates brittle parsing regexes
Combined with metrics (Micrometer) and distributed tracing (OpenTelemetry), this approach forms the foundation of modern observability.