Memory observability, traceability, and lineage


Conversation Memory propagates structured trace identifiers across every stage of the bulk profile import pipeline. Use these identifiers to monitor import jobs, group related events, and trace issues to individual profiles.


Overview


When you submit a bulk profile import, Conversation Memory emits product events and attaches a trace identifier to each event. These identifiers remain consistent across all related events, so you can group, filter, and follow the full lineage of an import—from initial submission through individual profile creation or update.


A bulk profile import produces three layers of observable events:

  1. Import creation: One event when the import job is submitted
  2. Batch submission: One event per batch of up to 1,000 rows
  3. Profile events: Individual events for each profile processed within a batch

Each layer carries a trace identifier that links it to the others. You can therefore move between the overall job view and a single profile without losing context.


  • import_id: Generated when the bulk profile import job is submitted. Shared across all events for the job, so you can track the overall import from submission to completion.
  • request_id: Assigned to the processing of a specific batch. Shared across all operations for that batch, so you can trace the end-to-end flow of a single batch.
  • profile_id: Uniquely identifies a single profile in the import. Shared across all operations for that profile, so you can verify or investigate the outcome for a specific record.
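As a rough illustration, the three identifiers form a hierarchy that you can reconstruct from exported event data. The sketch below assumes events have been exported as a list of dictionaries carrying the `import_id`, `request_id`, and `profile_id` fields described above; the overall record shape is hypothetical, not an official export format.

```python
from collections import defaultdict

# Hypothetical exported events. The identifier field names come from the
# table above; everything else about the record shape is illustrative.
events = [
    {"type": "created-import", "import_id": "imp_1"},
    {"type": "submitted-import", "import_id": "imp_1", "request_id": "req_a"},
    {"type": "profile", "import_id": "imp_1", "request_id": "req_a", "profile_id": "pr_1"},
    {"type": "profile", "import_id": "imp_1", "request_id": "req_a", "profile_id": "pr_2"},
]

def lineage(events, import_id):
    """Group one import job's events by batch, then list the profiles in each batch."""
    batches = defaultdict(list)
    for e in events:
        if e.get("import_id") != import_id:
            continue
        rid = e.get("request_id")
        if rid is None:
            continue  # job-level event with no batch identifier
        profiles = batches[rid]  # touching the key registers the batch even if empty
        if "profile_id" in e:
            profiles.append(e["profile_id"])
    return dict(batches)

print(lineage(events, "imp_1"))  # {'req_a': ['pr_1', 'pr_2']}
```

Walking the result from job to batch to profile mirrors the three event layers described earlier.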

Import creation


When a bulk profile import is submitted, Memory emits the following event:

com.twilio.memory.profiles.created-import

The payload includes:

  • import_id: The unique identifier for this import job
  • file_name: The source file associated with the import

For each batch of up to 1,000 rows, Memory emits:

com.twilio.memory.profiles.submitted-import

The payload includes:

  • request_id: The identifier for tracking this specific batch

Details of the submitted batch also appear in the properties of the parent com.twilio.memory.profiles.created-import event.

Each profile processed within a batch produces its own event. Filter by request_id to return all profile events for that batch and view which profiles succeeded, failed, or require attention.
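As a sketch, partitioning one batch's profile events into successes and failures might look like the following. The `log_level` values mirror those described under Identify errors and warnings below; the record shape itself is an assumption made for illustration.

```python
# Illustrative profile events for two batches. request_id, profile_id, and
# log_level are fields this page describes; the record shape is hypothetical.
profile_events = [
    {"request_id": "req_a", "profile_id": "pr_1", "log_level": "info"},
    {"request_id": "req_a", "profile_id": "pr_2", "log_level": "error"},
    {"request_id": "req_b", "profile_id": "pr_3", "log_level": "info"},
]

def batch_outcomes(events, request_id):
    """Split one batch's profile events into succeeded vs. needs-attention."""
    in_batch = [e for e in events if e.get("request_id") == request_id]
    ok = [e["profile_id"] for e in in_batch if e["log_level"] == "info"]
    attention = [e["profile_id"] for e in in_batch
                 if e["log_level"] in ("error", "warning")]
    return ok, attention

print(batch_outcomes(profile_events, "req_a"))  # (['pr_1'], ['pr_2'])
```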


Monitoring and debugging


Monitor overall job status


In the Twilio Console Debugger, filter logs by import_id to view all events associated with the import job.

Debug batch-level failures


Filter logs by request_id to surface processing details for a specific batch. This view helps identify bottlenecks or errors that affect every profile in the batch.

Trace individual profile outcomes


Filter logs by profile_id to isolate a single profile and confirm successful creation or update, or to investigate failures.

Identify errors and warnings


To focus on failures, filter logs where log_level is error or warning. Each profile event includes:

  • error_code: Mapped to the Twilio error catalog
  • error_message: A human-readable description

Combine these fields with the Twilio API error reference to determine likely root causes and next steps.
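A quick way to find the dominant failure mode is to tally failures by `error_code` before looking codes up in the error catalog. In this sketch, the `error_code` and `error_message` fields come from the list above, but the specific codes and messages are made up for illustration.

```python
from collections import Counter

# Hypothetical failed profile events; the codes and messages are invented
# examples, not real entries from the Twilio error catalog.
failures = [
    {"profile_id": "pr_2", "error_code": 60001, "error_message": "Invalid field"},
    {"profile_id": "pr_7", "error_code": 60001, "error_message": "Invalid field"},
    {"profile_id": "pr_9", "error_code": 60002, "error_message": "Duplicate profile"},
]

# Counting occurrences per error_code surfaces the most common failure,
# which you can then cross-reference against the Twilio API error reference.
by_code = Counter(e["error_code"] for e in failures)
print(by_code.most_common())  # [(60001, 2), (60002, 1)]
```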


As a quick reference, search for the identifier that matches your goal:

  • All events for a bulk import job: import_id
  • All events for a specific batch: request_id
  • All events for a specific profile: profile_id
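All three searches reduce to the same generic filter over exported events; as before, the event shape here is an assumption for illustration.

```python
def events_matching(events, field, value):
    """Return events whose identifier field (import_id, request_id,
    or profile_id) matches the given value."""
    return [e for e in events if e.get(field) == value]

# Hypothetical exported events with the identifier fields this page describes.
events = [
    {"import_id": "imp_1", "request_id": "req_a", "profile_id": "pr_1"},
    {"import_id": "imp_1", "request_id": "req_b", "profile_id": "pr_2"},
]

print(len(events_matching(events, "request_id", "req_a")))  # 1
```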

Using AI tools for troubleshooting


To use third-party AI coding assistants, provide Conversation Memory documentation and exported log data for pattern analysis and remediation suggestions. Omit or redact sensitive information before sharing data with external services.