Relationships in Mixpeek enable you to define and discover meaningful connections between video segments and other entities. These connections can represent similarity, temporal sequences, demonstrations of concepts, and more.

Relationships are only available to enterprise customers; email info@mixpeek.com for a demo.

Core Relationship Types

Feature-to-Feature

Connect video segments to one another:

  • similar_to: Visual/semantic similarity
  • precedes/follows: Temporal sequence
  • variant_of: Different versions
  • part_of: Segment containment

Feature-to-Node

Link to taxonomy concepts:

  • demonstrates: Shows example
  • references: Contains/mentions
  • violates: Shows incorrect example
  • explains: Provides instruction

Feature-to-Cluster

Connect to discovered groups:

  • belongs_to: Cluster membership
  • near: Proximity relationship
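The three type lists above form a small vocabulary that is worth validating before creating relationships. A minimal sketch (the grouping keys are illustrative, not part of the API):

```python
# The documented relationship vocabulary, grouped by the entity kinds each
# type connects. The type names come from the lists above; the grouping
# keys are illustrative helpers, not API fields.
RELATIONSHIP_TYPES = {
    "feature_to_feature": {"similar_to", "precedes", "follows", "variant_of", "part_of"},
    "feature_to_node": {"demonstrates", "references", "violates", "explains"},
    "feature_to_cluster": {"belongs_to", "near"},
}

def is_valid_type(rel_type: str) -> bool:
    """Check a relationship type against the documented vocabulary."""
    return any(rel_type in types for types in RELATIONSHIP_TYPES.values())
```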

Implementation

POST /entities/relationships
{
  "from": "feat_123",      // Source video segment
  "to": "feat_456",        // Target entity
  "type": "similar_to",    // Relationship type
  "score": 0.88,          // Optional confidence score
  "metadata": {
    "vector_index": "multimodal",
    "timestamp_range": {
      "start": 10.5,
      "end": 15.2
    }
  }
}
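The request above can be issued from any HTTP client. A stdlib Python sketch, assuming a base URL and bearer-token auth header (both placeholders; substitute your deployment's values):

```python
import json
import urllib.request

BASE_URL = "https://api.mixpeek.com"  # assumed host, replace with yours
API_KEY = "YOUR_API_KEY"              # placeholder credential

def build_relationship(from_id, to_id, rel_type, score=None, metadata=None):
    """Assemble the request body shown above; score and metadata are optional."""
    payload = {"from": from_id, "to": to_id, "type": rel_type}
    if score is not None:
        payload["score"] = score
    if metadata:
        payload["metadata"] = metadata
    return payload

def create_relationship(**kwargs):
    """POST the relationship to /entities/relationships and return the response."""
    body = json.dumps(build_relationship(**kwargs)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/entities/relationships",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```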

Graph Response Structure

When querying the relationship graph, the API returns a network of connected entities and their relationships:

{
  "nodes": [
    {
      "id": "feat_123",
      "type": "feature",
      "metadata": {
        "title": "Tennis Serve",
        "duration": 15.5,
        "timestamp": 45.2
      }
    },
    {
      "id": "node_abc",
      "type": "node",
      "metadata": {
        "name": "Serve Technique",
        "taxonomy": "sports_training"
      }
    },
    {
      "id": "clu_xyz",
      "type": "cluster",
      "metadata": {
        "name": "Pro Serves",
        "size": 45
      }
    }
  ],
  "edges": [
    {
      "from": "feat_123",
      "to": "node_abc",
      "type": "demonstrates",
      "score": 0.92,
      "metadata": {
        "timestamp_range": {
          "start": 10.5,
          "end": 15.2
        }
      }
    },
    {
      "from": "feat_123",
      "to": "clu_xyz",
      "type": "belongs_to",
      "score": 0.88
    }
  ],
  "metadata": {
    "depth": 2,
    "total_nodes": 3,
    "total_edges": 2,
    "query_time_ms": 45
  }
}

The graph response includes:

  • Nodes: All entities in the graph with their metadata
  • Edges: All relationships between nodes with scores and metadata
  • Metadata: Information about the graph query itself

This structure enables visualization and analysis of relationship networks between features, nodes, and clusters.
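As a starting point for such analysis, the `edges` array can be grouped into an adjacency map. A minimal sketch over the sample response above:

```python
from collections import defaultdict

def outgoing_edges(graph, entity_id):
    """Group a graph response's edges by source node and return the
    (type, target, score) tuples for edges leaving entity_id."""
    adjacency = defaultdict(list)
    for edge in graph["edges"]:
        adjacency[edge["from"]].append(
            (edge["type"], edge["to"], edge.get("score"))
        )
    return adjacency[entity_id]

# Edges taken from the sample response above.
graph = {
    "edges": [
        {"from": "feat_123", "to": "node_abc", "type": "demonstrates", "score": 0.92},
        {"from": "feat_123", "to": "clu_xyz", "type": "belongs_to", "score": 0.88},
    ]
}
```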

Internal Relationship Structure

Features store relationships in a simplified array structure:

{
  "relationships": [
    {
      "from": "feat_123",      // Current feature
      "to": "feat_456",        // Related feature
      "type": "similar_to",
      "score": 0.88,
      "metadata": {
        "vector_index": "multimodal"
      }
    },
    {
      "from": "feat_123",
      "to": "node_abc123",
      "type": "demonstrates",
      "score": 0.92
    }
  ]
}
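Because relationships are stored as a flat array on the feature, common lookups reduce to simple filters. A sketch over a feature document shaped like the example above:

```python
def relationships_of_type(feature, rel_type, min_score=0.0):
    """Filter a feature's stored relationships by type and an optional score floor."""
    return [
        r for r in feature.get("relationships", [])
        if r["type"] == rel_type and r.get("score", 0.0) >= min_score
    ]

# A feature document shaped like the example above.
feature = {
    "relationships": [
        {"from": "feat_123", "to": "feat_456", "type": "similar_to", "score": 0.88},
        {"from": "feat_123", "to": "node_abc123", "type": "demonstrates", "score": 0.92},
    ]
}
```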

Best Practices for Video Relationships

1. Relationship Selection

  • Choose appropriate relationship types
  • Consider temporal context
  • Use meaningful scores
  • Include relevant metadata

2. Score Generation

  • Vector similarity for "similar_to"
  • Temporal proximity for sequences
  • Model confidence for demonstrations
  • Distance metrics for clusters

3. Performance Optimization

  • Index frequently queried relationships
  • Batch relationship creation
  • Cache common graph queries
  • Monitor relationship density
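For "similar_to" scores, vector similarity between segment embeddings is typically computed as cosine similarity. A minimal stdlib sketch (how the embeddings are produced is up to your pipeline):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors; a common basis
    for a similar_to score. Returns 0.0 for a zero-length vector."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0
```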

Consider these limitations when creating relationships:

  • Maximum relationships per feature: 1000
  • Maximum graph query depth: 3
  • Rate limits apply to relationship operations

Combine relationships with clusters and taxonomies for the most effective content organization. Each provides different insights into your video content.