This document is a Compliance Test Plan developed by Digital Cinema Initiatives, LLC (DCI). DCI is the owner of this Compliance Test Plan for the purpose of copyright and other laws in all countries throughout the world. The DCI copyright notice must be included in all reproductions, whether in whole or in part, and may not be deleted or attributed to others. DCI hereby grants to its members and their suppliers a limited license to reproduce this Compliance Test Plan for their own use, provided it is not sold. Others must obtain permission to reproduce this Compliance Test Plan from Digital Cinema Initiatives, LLC.
This Compliance Test Plan is intended solely as a guide for companies interested in developing products that can be compatible with other products developed using this document and [DCI-DCSS] . Each DCI member company shall decide independently the extent to which it will utilize, or require adherence to, this Compliance Test Plan. DCI shall not be liable for any exemplary, incidental, proximate or consequential damages or expenses arising from the use of this document. This document defines only one approach to compatibility, and other approaches may be available to the industry. Only DCI has the right and authority to revise or change the material contained in this document, and any revisions by any party other than DCI are unauthorized and prohibited.
Using this document may require the use of one or more features covered by proprietary rights (such as features which are the subject of a patent, patent application, copyright, mask work right or trade secret right). By publication of this document, no position is taken by DCI with respect to the validity or infringement of any patent or other proprietary right. DCI hereby expressly disclaims any liability for infringement of intellectual property rights of others by virtue of the use of this document. DCI has not and does not investigate any notices or allegations of infringement prompted by publication of any DCI document, nor does DCI undertake a duty to advise users or potential users of DCI documents of such notices or allegations. DCI hereby expressly advises all users or potential users of this document to investigate and analyze any potential infringement situation, seek the advice of intellectual property counsel, and, if indicated, obtain a license under any applicable intellectual property right or take the necessary steps to avoid infringement of any intellectual property right. DCI expressly disclaims any intent to promote infringement of any intellectual property right by virtue of the evolution or publication of this document.
DCI gratefully acknowledges the participation and technical contributions of Sandflow Consulting LLC, San Mateo, CA, https://www.sandflow.com/ , in the preparation of this document.
DCI gratefully acknowledges the participation and technical contributions of CineCert LLC, 2840 N. Lima St, Suite 110A, Burbank, CA 91504 https://www.cinecert.com/ , in the preparation of this document.
DCI gratefully acknowledges the participation and technical contributions of the Fraunhofer Institute for Integrated Circuits, IIS, Am Wolfsmantel 33, 91058 Erlangen, Germany, http://www.iis.fraunhofer.de/ , in the preparation of this document.
Digital Cinema Initiatives, LLC (DCI) is a joint venture of Disney, Fox, Paramount, Sony Pictures Entertainment, Universal, and Warner Bros. Studios. The primary purpose of DCI is to establish uniform specifications for d-cinema. These DCI member companies believe that d-cinema will provide real benefits to theater audiences, theater owners, filmmakers and distributors. DCI was created with the recognition that these benefits could not be fully realized without industry-wide specifications. All parties involved in d-cinema must be confident that their products and services are interoperable and compatible with the products and services of all industry participants. The DCI member companies further believe that d-cinema exhibition will significantly improve the movie-going experience for the public.
Digital cinema is today being used worldwide to show feature motion pictures to thousands of audiences daily, at a level of quality commensurate with (or better than) that of 35mm film release prints. Many of these systems are informed by the Digital Cinema System Specification, Version 1.0, published by DCI in 2005. In areas of image and sound encoding, transport security and network services, today's systems offer practical interoperability and an excellent movie-going experience. These systems were designed, however, using de-facto industry practices.
With the publication of the Digital Cinema System Specification [DCI-DCSS] , and the publication of required standards from SMPTE, ISO, and other bodies, it is possible to design and build d-cinema equipment that meets all DCI requirements. Manufacturers preparing new designs and theaters planning expensive upgrades are grappling with the same question: how does one know whether a d-cinema system is compliant with DCI requirements?
Note: This test plan references standards from SMPTE, ISO, and other bodies that have specific publication dates. The specific versions of the referenced documents to be used in conjunction with this test plan shall be those listed in Appendix F .
This Compliance Test Plan (CTP) was developed by DCI to provide uniform testing procedures for d-cinema equipment. The CTP details testing procedures, reference files, design evaluation methods, and directed test sequences for content packages and specific types of equipment. These instructions will guide the Test Operator through the testing process and the creation of a standard DCI compliance evaluation report.
This document is presented in three parts and eight appendices.
The procedures in this document are substantially traceable to the many normative references cited throughout. In some cases, DCI has chosen to express a constraint or required behavior directly in this document. In these cases it will not be possible to trace the requirement directly to an external document. Nonetheless, the requirement is made normative for the purposes of DCI compliance testing by its appearance in this document.
This document is written to inform readers from many segments of the motion picture industry, including manufacturers, content producers, distributors, and exhibitors. Readers will have specific needs of this text and the following descriptions will help identify the parts that will be most useful to them. Generally though, the reader should have technical experience with d-cinema systems and access to the required specifications. Some experience with general operating system concepts and installation of source code software will be required to run many of the procedures.
This document uses the following typographical conventions to convey information in its proper context.
A Bold Face style is used to display the names of commands to be run on a computer system.
A Fixed Width font is used to express literal data such as string values or element names for XML documents, or command-line arguments and output. Examples that illustrate command input and output are displayed in a Fixed Width font on a shaded background:
$ echo "Hello, World!"
Hello, World!  (1)
Less-than ( < ) and greater-than ( > ) symbols are used to illustrate generalized input values in command-line examples. They are placed around the generalized input value, e.g. , <input-value> . These symbols are also used to direct command output in some command-line examples, and are also an integral part of the XML file format.
Callouts (white numerals on a black background, as in the example above) are used to provide reference points for examples that include explanations. Examples with callouts are followed by a list of descriptions explaining each callout.
Square brackets ([ and ]) are used to denote an external document reference, e.g. , [SMPTE-377-1] .
The test procedures documented in Part I. Procedural Tests will contain the following sub-sections (except as noted):
Note – There may be additional restrictions, depending on implementation. For example, some Media Blocks may refuse to perform even the most basic operations as long as they are not attached to an SMS or Imaging Device. For these environments, additional equipment may be required.
The [DCI-DCSS] allows different system configurations, meaning different ways of grouping functional modules and equipment together. The following diagram shows what is considered to be a typical configuration allowed by DCI.
The left side of the diagram shows the extra-theater part, which deals with DCP and KDM generation and transport. The right side shows the intra-theater part, which shows the individual components of the theater system and how they work together. This test plan will test for proper DCP and KDM formats ( i.e. , conforming to the Digital Cinema System Specification), for proper transport of the data and for proper processing of valid and malformed DCPs and KDMs. In addition, physical system properties and performance will be tested in order to ensure that the system plays back the data as expected and implements all security measures as required by DCI.
While the above diagram shows what is considered to be a typical configuration allowed by the Digital Cinema System Specification, the [DCI-DCSS] still leaves room for different implementations. For example, some manufacturers may choose to integrate the Media Decryptor blocks into the Imaging Device, or to share storage between d-cinema servers.
In order to successfully execute one of the test sequences given in Part III. Consolidated Test Procedures , the Test Operator must understand the details of many documents and must have assembled the necessary tools and equipment to execute the tests. This document provides all the necessary references to standards, tutorials and tools to orient the technical reader.
As an example, Section 7.5.12 requires a calculation to be performed on a set of measured and reference values to determine whether an Imaging Device's colorimetry is within tolerance. Section C.6 provides an implementation of this calculation, but the math behind the program and the explanation behind the math are not presented in this document. The Test Operator and system designer must read the reference documents noted in Section 7.5.12 (and any references those documents may make) in order to fully understand the process and create an accurate design or present accurate results on a test report.
Preparing a Test Subject and the required documentation requires the same level of understanding as executing the test. Organizations may even choose to practice executing the test internally in preparation for a test by a Testing Organization. The test procedures have been written to be independent of any proprietary tools. In some cases this policy has led to an inefficient procedure, but the resulting transparency provides a reference measurement that can be used to design new tools, and verify results obtained from any proprietary tools a Testing Organization may use.
Many tests in this Part rely on the Security Manager promptly making available log records of events. In order to provide a bound on test durations, failure of a Security Manager to make the record of an event available as part of a log report within 5 minutes of the event being recorded is cause to fail the test being conducted.
Authentication of devices in d-cinema is accomplished using asymmetric cryptography . Unlike symmetric cryptography, which uses the same key to encrypt and decrypt data, asymmetric cryptography uses a pair of keys that each reverse the other's cryptographic operations: data encrypted with one key in the key pair can only be decrypted by the other key in the key pair. In such a key pair, there is a public key that is distributed freely, and a private key that is closely held and protected. Public keys are not easily distinguished from one another because they don't carry any identifying information (they're just really long random numbers). To address this, public keys are distributed with metadata that describes the person or device that holds the private key, called the subject . This set of metadata and the public key comprise the digital certificate . The standard that defines a digital certificate for d-cinema is [SMPTE-430-2] . It is based on the ITU standard for Public Key Infrastructure, called X.509 , and specifies a number of constraints on the X.509v3 standard, such as the X.509 version that can be used and the size of the RSA keys, among other things.
A digital certificate also contains a signature , created by generating a message digest of the certificate and then encrypting that message digest with a (usually different) private key. The signature is then added to the certificate, and is used to verify that the certificate is authentic. The holder of the (private) key used to sign a certificate (encrypt the message digest) is known as the issuer , and identifying information about the issuer is in the Issuer field of the certificate, linking the issuer to the subject's certificate. Similarly, identifying information about the subject is in the Subject field. In most cases, the issuer and the subject are different. When the issuer and subject are the same, the certificate is known as being self- signed . A self-signed certificate is also self-validating, as its own public key is used to validate its signature. When a self-signed certificate is used to sign other certificates, it becomes the Certificate Authority (CA) for those certificates. The collection of certificates, from the top CA certificate to the last certificate (known as a leaf certificate ) are collectively called the certificate chain .
Certificate authentication is recursive: in order to verify that a certificate is valid you have to decrypt the signature using the public key in the issuer's certificate. Once that signature is validated, if the issuer's certificate is not self signed then the signature validation process continues up the chain until a self-signed (CA) certificate is validated. A certificate is trusted only if its entire chain is valid.
The test procedures in this chapter are organized into two groups: tests that evaluate a certificate's compliance to [SMPTE-430-2] and tests that evaluate the behavior of devices that decode certificates. The Certificate Decoder tests are in this section because they are not specific to any particular type of system. All d-cinema devices that decode certificates must behave in the manner described by these tests.
The testing procedures that follow make use of the openssl cryptographic tools and library. openssl is a well known, free, and open source software package available for a number of hardware platforms and operating systems.
Much of the information in a digital certificate can be viewed in a human-readable format using openssl 's 'text' option. The information presented in the text output can be used to validate a number of certificate requirements, and to validate certificate-related KDM requirements by comparing the values present in the text output to the values in the KDM. The following example illustrates the features of a typical d-cinema leaf certificate:
$ openssl x509 -text -noout -in smpte-430-2-leaf-cert.pem  (1)
Certificate:
    Data:
        Version: 3 (0x2)  (2)
        Serial Number: 39142 (0x98e6)  (3)
        Signature Algorithm: sha256WithRSAEncryption  (4)
        Issuer: O=.ca.example.com, OU=.ra-1b.ra-1a.s430-2.ca.example.com,
            CN=.cc-admin/dnQualifier=0sdCakNi3z6UPCYnogMFITbPMos=  (5)
        Validity:  (6)
            Not Before: Mar  9 23:29:52 2007 GMT  (7)
            Not After : Mar  8 23:29:45 2008 GMT  (8)
        Subject: O=.ca.example.com, OU=.cc-admin.ra-1b.ra-1a.s430-2.ca.example.com,  (9)
            CN=SM.ws-1/dnQualifier=H/i8HyVmKEZSFoTeYI2UV9aBiq4=  (10)
        Subject Public Key Info:  (11)
            Public Key Algorithm: rsaEncryption  (12)
            RSA Public Key: (2048 bit)  (13)
                Modulus (2048 bit):  (14)
                    [hexadecimal values omitted for brevity]
                Exponent: 65537 (0x10001)  (15)
        X509v3 extensions:  (16)
            X509v3 Key Usage:  (17)
                Digital Signature, Key Encipherment, Data Encipherment  (18)
            X509v3 Basic Constraints: critical  (19)
                CA:FALSE
            X509v3 Subject Key Identifier:  (20)
                1F:F8:BC:1F:25:66:28:46:52:16:84:DE:60:8D:94:57:D6:81:8A:AE
            X509v3 Authority Key Identifier:  (21)
                keyid:D2:C7:42:6A:43:62:DF:3E:94:3C:26:27:A2:03:05:21:36:CF:32:8B
                DirName:/O=.ca.example.com/OU=.ra-1a.s430-2.ca.example.com/
                    CN=.ra-1b/dnQualifier=3NMh+Nx9WhnbDcXKK1puOjX4lsY=
                serial:56:CE
    Signature Algorithm: sha256WithRSAEncryption  (22)
        [hexadecimal values omitted for brevity]
Verify that the Issuer and Subject fields are present inside the signed part of the certificate.
$ openssl x509 -text -noout -inform PEM -in <certificate>

A correctly formatted and encoded certificate will be displayed as text output by openssl . An incorrectly formed certificate will cause openssl to display an error. A certificate that causes an error to be displayed by the openssl command is incorrectly formed and shall be cause to fail this test.
The version of the certificate and the presence of the Issuer and Subject fields in the signed portion of the certificate can be verified by viewing openssl's text output of the certificate. The version number is indicated by 2 in the example certificate, and the Issuer and Subject fields are indicated by numbers 5 and 10, respectively. An X.509 version number other than 3, or the absence of either the Subject field or the Issuer field, shall be cause to fail this test.
Verify that the SignatureAlgorithm of the signature and the SignatureAlgorithm in the signed portion of the certificate both contain the value "sha256WithRSAEncryption" .
$ openssl x509 -text -noout -in <certificate>

The signature algorithm of the certificate is indicated by 4 in the example certificate, and the signature algorithm of the signature is indicated by number 22 of the example certificate.
Verify that these fields both contain the value "sha256WithRSAEncryption" . If either field contains a different value, this shall be cause to fail this test.
Verify that the SignatureValue field is present outside the signed part of the certificate and contains an ASN.1 Bit String that contains a PKCS #1 SHA256WithRSA signature block.
$ openssl x509 -text -noout -in <certificate>

A correct certificate signature will be displayed as colon-separated hexadecimal values in the text output by openssl . The signature block, omitted from the example certificate, will be present below the signature algorithm at the bottom of the output, below callout number 22 of the example certificate. An incorrect certificate signature will cause openssl to display an error. A certificate that causes openssl to generate errors is cause to fail this test. A signature algorithm other than sha256WithRSAEncryption is cause to fail this test.
Verify that the Serial Number field is present inside the signed part of the certificate and that it contains a non-negative integer that is no longer than 64 bits (8 bytes).
$ openssl x509 -text -noout -in <certificate>

The serial number field is indicated by 3 in the example certificate. Confirm that the serial number is a non-negative integer that is no longer than 64 bits (8 bytes), and that the parenthetical phrase "neg" is not present. A negative serial number or a number larger than 64 bits shall be cause to fail this test.
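The 64-bit bound can be expressed as a simple predicate. The following Python sketch is illustrative only (the function name is ours, not part of the normative procedure); it checks a serial number that has already been parsed to an integer:

```python
def serial_number_ok(serial: int) -> bool:
    # A conforming serial number is a non-negative integer
    # that fits in 64 bits (8 bytes).
    return 0 <= serial <= 2**64 - 1

# The serial number from the example certificate (callout 3):
assert serial_number_ok(39142)        # 0x98e6, well within 64 bits
assert not serial_number_ok(-1)       # negative serial numbers fail
assert not serial_number_ok(2**64)    # a 65-bit value fails
```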
Verify that the Subject Public Key Info field is present inside the signed part of the certificate and that it describes an RSA public key with a modulus length of 2048 bits and a public exponent of 65537.
$ openssl x509 -text -noout -in <certificate>

The Subject Public Key Info is indicated by 11 in the example certificate. The modulus length and the public exponent are indicated by 14 and 15 , respectively.
Verify that the Public Key Algorithm type is rsaEncryption and the RSA Public Key is (2048 bit) . Failure to meet both requirements is cause to fail this test.
Verify that the Modulus is (2048 bit) and that the Exponent is 65537 (0x10001) . Any other value for the modulus length or the exponent shall be cause to fail this test.
The section "RSA Key Format" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Verify that the Validity field is present inside the signed part of the certificate and contains timestamps in UTC. Timestamps with years up to and including 2049 must use two digits (UTCTime) to represent the year. Timestamps with the year 2050 or later must use four digits (GeneralizedTime) to represent the year.
$ openssl x509 -text -noout -in <certificate>

The validity field is indicated by callout 6 in the example certificate. Confirm that the field is present and that it contains a "Not Before" value as a UTC timestamp as indicated by 7 of the example certificate and a "Not After" value as a UTC timestamp as indicated by 8 of the example certificate. If the validity field is not present, this shall be cause to fail this test.
Verifying the format of the timestamps as either UTCTime or GeneralizedTime can be accomplished by viewing the ASN.1 sequences of the certificate with openssl . Additionally, by using the grep command to specify a text string to display, in this case, "TIME", the time formats can be quickly identified:
$ openssl asn1parse -in <certificate> | grep TIME
  154:d=3  hl=2 l=  13 prim: UTCTIME :070312145212Z
  169:d=3  hl=2 l=  13 prim: UTCTIME :270307145212Z

Confirm that timestamps up to and including the year 2049 are in UTCTime format, and that timestamps starting with the year 2050 are in GeneralizedTime format. Timestamps in UTCTime format will be formatted as "YYMMDDhhmmssZ", and timestamps in GeneralizedTime format will have the year coded as "YYYYMMDDhhmmssZ", where "Y" represents the year, "M" represents the month, "D" represents the day, and "h", "m", "s", and "Z" represent hours, minutes, seconds, and the Coordinated Universal Time zone designator. A timestamp in a year up to and including 2049 that is not in UTCTime format shall be cause to fail this test. A timestamp in the year 2050 or later that is not in GeneralizedTime format shall be cause to fail this test.
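The year-dependent format rule can be checked mechanically. The following Python sketch is illustrative (function names are ours); it validates the shape of each ASN.1 time value as printed by asn1parse and recovers the four-digit year, applying the two-digit-year mapping defined for UTCTime in RFC 5280:

```python
import re

def required_time_format(year: int) -> str:
    # Years up to and including 2049 must be encoded as two-digit
    # UTCTime; years 2050 and later as four-digit GeneralizedTime.
    return "UTCTIME" if year <= 2049 else "GENERALIZEDTIME"

def parse_asn1_time(tag: str, value: str) -> int:
    """Return the four-digit year encoded in an ASN.1 time string."""
    if tag == "UTCTIME":
        if not re.fullmatch(r"\d{12}Z", value):           # YYMMDDhhmmssZ
            raise ValueError("malformed UTCTime: " + value)
        yy = int(value[:2])
        # RFC 5280 mapping: 00-49 -> 20xx, 50-99 -> 19xx
        return 2000 + yy if yy <= 49 else 1900 + yy
    if tag == "GENERALIZEDTIME":
        if not re.fullmatch(r"\d{14}Z", value):           # YYYYMMDDhhmmssZ
            raise ValueError("malformed GeneralizedTime: " + value)
        return int(value[:4])
    raise ValueError("unknown time tag: " + tag)

# The two UTCTIME values from the asn1parse output above:
assert parse_asn1_time("UTCTIME", "070312145212Z") == 2007
assert parse_asn1_time("UTCTIME", "270307145212Z") == 2027
assert required_time_format(2027) == "UTCTIME"
assert required_time_format(2050) == "GENERALIZEDTIME"
```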
Verify that the Authority Key Identifier field is present in the X509v3 Extensions section inside the signed part of the certificate.
The Authority Key Identifier field can be verified by using the openssl command to display the certificate information as described in Example 2.1 , e.g. :
$ openssl x509 -text -noout -in <certificate>

The Authority Key Identifier of the certificate is indicated by 21 in the example certificate. Confirm that this field exists. The absence of the Authority Key Identifier field shall be cause to fail this test.
Verify that the Key Usage field is present in the X509v3 Extensions section inside the signed part of the certificate.
For signer certificates, verify that only the "Certificate Sign" (keyCertSign) flag is true; the "CRL Sign" (cRLSign) flag may optionally be present.

For the SM role leaf certificate of a dual certificated MB, verify that the "Certificate Sign" (keyCertSign) , "CRL Sign" (cRLSign) , and "Digital Signature" (digitalSignature) flags are false or not present, and that the "Key Encipherment" (keyEncipherment) flag is true.

For the LS role leaf certificate of a dual certificated MB, verify that the "Certificate Sign" (keyCertSign) , "CRL Sign" (cRLSign) , and "Key Encipherment" (keyEncipherment) flags are false or not present, and that the "Digital Signature" (digitalSignature) flag is true.

For all leaf certificates not part of a dual certificated MB, verify that the "Certificate Sign" (keyCertSign) and "CRL Sign" (cRLSign) flags are false or not present, and that the "Digital Signature" (digitalSignature) and "Key Encipherment" (keyEncipherment) flags are true.
The Key Usage field can be verified by using the openssl command to display the certificate information as described in Example 2.1 , e.g. :
$ openssl x509 -text -noout -in <certificate>

The Key Usage field in the certificate is indicated by 17 in the example certificate.
For all certificates, confirm that this field exists. Absence of the Key Usage field shall be cause to fail this test.

For signing certificates, confirm that the key usage listed in the usage list (indicated by 18 ) has only "Certificate Sign" (keyCertSign) ; the optional "CRL Sign" (cRLSign) flag may be present. Absence of the "Certificate Sign" (keyCertSign) flag, or presence of any other flag except for "CRL Sign" (cRLSign) , shall be cause to fail this test.

For the SM role leaf certificate of a dual certificated MB, confirm that the key usage lists "Key Encipherment" (keyEncipherment) , and that "Digital Signature" (digitalSignature) is absent. Absence of "Key Encipherment" (keyEncipherment) , or presence of "Digital Signature" (digitalSignature) , shall be cause to fail this test. Presence of "Certificate Sign" (keyCertSign) or "CRL Sign" (cRLSign) shall be cause to fail this test.

For the LS role leaf certificate of a dual certificated MB, confirm that the key usage lists "Digital Signature" (digitalSignature) , and that "Key Encipherment" (keyEncipherment) is absent. Absence of "Digital Signature" (digitalSignature) , or presence of "Key Encipherment" (keyEncipherment) , shall be cause to fail this test. Presence of "Certificate Sign" (keyCertSign) or "CRL Sign" (cRLSign) shall be cause to fail this test.

For all leaf certificates not part of a dual certificated MB, confirm that the key usage lists "Digital Signature" (digitalSignature) and "Key Encipherment" (keyEncipherment) . Absence of "Digital Signature" (digitalSignature) or "Key Encipherment" (keyEncipherment) shall be cause to fail this test. Presence of "Certificate Sign" (keyCertSign) or "CRL Sign" (cRLSign) shall be cause to fail this test.
Note that leaf certificates may have other key usages specified, and the presence of other usages not specifically referenced here shall not be a reason to fail this test.
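The per-role rules above can be summarized as a single predicate. The sketch below is illustrative only (the certificate-type labels and function name are ours, not from [SMPTE-430-2]); it takes the set of KeyUsage flags asserted true in a certificate. Per the note above, flags beyond the three tested here are ignored for leaf certificates, while signer certificates reject any flag other than keyCertSign and cRLSign:

```python
def key_usage_ok(cert_type: str, flags: set) -> bool:
    """Check the KeyUsage rules described above.

    cert_type: "signer", "sm_leaf" (SM role of a dual certificated MB),
               "ls_leaf" (LS role of a dual certificated MB), or "leaf".
    flags: the set of KeyUsage flag names asserted true in the cert.
    """
    if cert_type == "signer":
        # Only keyCertSign required; cRLSign optionally allowed;
        # any other flag is cause for failure.
        return "keyCertSign" in flags and not (flags - {"keyCertSign", "cRLSign"})
    signing = {"keyCertSign", "cRLSign"}
    if cert_type == "sm_leaf":
        return ("keyEncipherment" in flags
                and "digitalSignature" not in flags
                and not (flags & signing))
    if cert_type == "ls_leaf":
        return ("digitalSignature" in flags
                and "keyEncipherment" not in flags
                and not (flags & signing))
    if cert_type == "leaf":
        return ({"digitalSignature", "keyEncipherment"} <= flags
                and not (flags & signing))
    raise ValueError("unknown certificate type: " + cert_type)

assert key_usage_ok("signer", {"keyCertSign", "cRLSign"})
assert not key_usage_ok("signer", {"keyCertSign", "digitalSignature"})
# The example leaf certificate (callout 18) also asserts dataEncipherment:
assert key_usage_ok("leaf", {"digitalSignature", "keyEncipherment", "dataEncipherment"})
assert not key_usage_ok("sm_leaf", {"keyEncipherment", "digitalSignature"})
```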
Verify that the Basic Constraints field is present in the X509v3 Extensions section of the signed portion of the certificate. For signer certificates, verify that the certificate authority attribute is true (CA:TRUE) and that the PathLenConstraint value is present and either zero or positive. For leaf certificates, verify that the certificate authority attribute is false (CA:FALSE) and that the PathLenConstraint is absent or zero.
The Basic Constraints field can be verified by using the openssl command to display the certificate information as described in Example 2.1 , e.g. :
$ openssl x509 -text -noout -in <certificate>

The Basic Constraints field in the certificate is indicated by 19 in the example certificate. For signing certificates, confirm that this field exists, that the certificate authority value is true (CA:TRUE), and that the path length is present and is zero or a positive integer. For leaf certificates, confirm that the certificate authority value is false (CA:FALSE) and that the path length is absent or zero. The absence of the Basic Constraints field shall be cause to fail this test. For signer certificates, the absence of the CA:TRUE value, or a negative or missing Path Length value, shall be cause to fail this test. For leaf certificates, the presence of the CA:TRUE value or the presence of a path length greater than zero shall be cause to fail this test.
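These pass/fail rules reduce to a small predicate. The following Python sketch is illustrative (names are ours); path_len stands for the decoded PathLenConstraint value, or None when the constraint is absent:

```python
def basic_constraints_ok(is_signer: bool, ca: bool, path_len) -> bool:
    """Basic Constraints rules described above.
    path_len is the PathLenConstraint value, or None if absent."""
    if is_signer:
        # Signer: CA:TRUE with a present, non-negative path length.
        return ca and path_len is not None and path_len >= 0
    # Leaf: CA:FALSE with the path length absent or zero.
    return not ca and (path_len is None or path_len == 0)

assert basic_constraints_ok(True, True, 0)          # signer, pathlen 0
assert not basic_constraints_ok(True, True, None)   # missing path length fails
assert basic_constraints_ok(False, False, None)     # leaf, constraint absent
assert not basic_constraints_ok(False, True, None)  # leaf must be CA:FALSE
```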
Verify that there is exactly one DnQualifier present in the Subject field and that the DnQualifier value is the Base64 encoded thumbprint of the subject public key in the certificate. Also verify that there is exactly one DnQualifier present in the Issuer field and that the DnQualifier value is the Base64 encoded thumbprint of the issuer's public key.
The DnQualifier field can be verified by using the openssl command to display the certificate information as described in Example 2.1 , e.g. :
$ openssl x509 -text -noout -in <certificate>

The Subject DnQualifier in the certificate is in the Subject information as indicated by 10 in the example certificate, and the Issuer DnQualifier in the certificate is in the Issuer information as indicated by 5 . Confirm that each of these fields contains only one DnQualifier. Missing DnQualifier values in either of these fields, or the presence of more than one DnQualifier in either field, shall be cause to fail this test.
The public key DnQualifier must be recalculated to confirm that the DnQualifier value in each of these fields is correct.
The following steps perform this calculation:
$ openssl x509 -pubkey -noout -in <certificate> | openssl base64 -d \
    | dd bs=1 skip=24 2>/dev/null | openssl sha1 -binary | openssl base64

The resulting value is the calculated DnQualifier of the public key in the input certificate. Confirm that when this calculation is performed on the public key in the subject certificate, the calculated value is equal to the DnQualifier present in the Subject field. Confirm that when this calculation is performed on the public key in the issuer certificate, the calculated value is equal to the DnQualifier present in the Issuer field of the subject certificate. A DnQualifier that does not match the calculated value of the corresponding certificate's public key shall be cause to fail this test.
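The same calculation can be mirrored in Python as a cross-check on the openssl pipeline. This sketch makes the same assumption the pipeline above does: that the DER-encoded SubjectPublicKeyInfo carries a 24-byte header ahead of the key data (which holds for the 2048-bit RSA keys required here). The function name is ours, and the demonstration input is synthetic, not a real key:

```python
import base64
import hashlib

def dn_qualifier(pubkey_pem: str) -> str:
    # Strip the PEM armor lines and recover the DER bytes.
    body = "".join(line for line in pubkey_pem.splitlines()
                   if not line.startswith("-----"))
    der = base64.b64decode(body)
    # Skip the 24-byte SubjectPublicKeyInfo header, SHA-1 the rest, and
    # Base64-encode the digest -- the dd/sha1/base64 stages of the pipeline.
    digest = hashlib.sha1(der[24:]).digest()
    return base64.b64encode(digest).decode("ascii")

# Demonstration with synthetic DER bytes (a real run would use the
# "openssl x509 -pubkey" output for the certificate under test):
fake_pem = ("-----BEGIN PUBLIC KEY-----\n"
            + base64.b64encode(bytes(300)).decode("ascii")
            + "\n-----END PUBLIC KEY-----\n")
print(dn_qualifier(fake_pem))   # a 28-character Base64 SHA-1 thumbprint
```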
Verify that the OrganizationName field is present in the Issuer and Subject fields. Verify that the two OrganizationName values are identical.
The OrganizationName in the Subject and Issuer fields can be verified by using the openssl command to display the certificate information as described in Example 2.1 , e.g. :
$ openssl x509 -text -noout -in <certificate>

The OrganizationName values are in the Subject and Issuer fields in the certificate as indicated by 5 and 10 in the example certificate. Confirm that the Organization name, the value specified as "O=<organization-name>" , is the same in both fields. Non-identical OrganizationName values in the Subject and Issuer fields shall be cause to fail this test.
Verify that the OrganizationUnitName (OU) value is present in the Issuer and Subject fields.
The OrganizationUnitName in the Subject and Issuer fields can be verified by using the openssl command to display the certificate information as described in Example 2.1 , e.g. :
$ openssl x509 -text -noout -in <certificate>

The OrganizationUnitName values are in the Subject and Issuer fields in the certificate as indicated by 5 and 10 in the example certificate. The absence of an OrganizationUnitName in either the Subject or Issuer fields of the certificate shall be cause to fail this test.
Verify that the CommonName (CN) is present exactly once in both the Subject and Issuer fields. Also verify that the CommonName fields contain a physical identification of the entity ( i.e. , make, model, or serial number, for devices). For leaf certificates ( i.e. , certificate authority is set to False), verify that at least one role is specified and that it is the role expected for the certificate.
The CommonName in the Subject and Issuer fields can be verified by using the openssl command to display the certificate information as described in Example 2.1 , e.g. :
$ openssl x509 -text -noout -in <certificate>

The CommonName values are in the Subject and Issuer fields in the certificate as indicated by 5 and 10 in the example certificate. Confirm that the CommonName , the value specified as "CN=<common-name>" , is present only once and that it contains information that identifies the entity. For leaf certificates, confirm that the common name specifies at least one role and that it is correct for the certificate. The absence of the CommonName value in either the Subject or Issuer fields shall be cause to fail this test. For leaf certificates, the absence of a role designation shall be cause to fail this test.
$ openssl x509 -text -noout -in <certificate>

For signer certificates (certificates that have CA:TRUE), of the X.509v3 extensions listed in the certificate, "Basic Constraints" (indicated by 19 ) must be marked critical. "Basic Constraints" may also be marked critical for leaf certificates. "Key Usage" (indicated by 17 ) and "Authority Key Identifier" (indicated by 21 ) may be marked critical. No other unrecognized X.509v3 extensions may be marked critical. A signer certificate with a "Basic Constraints" section that is not marked critical shall be cause to fail this test. A certificate that has any X.509v3 extension marked critical other than "Basic Constraints", "Key Usage", or "Authority Key Identifier" shall be cause to fail this test.
-CAfile option.
$ openssl verify -CAfile caroot.pem caroot.pem
caroot.pem: OK
$ cp caroot.pem certchain.pem
$ openssl verify -CAfile certchain.pem signer.pem
signer.pem: OK
$ cat signer.pem >> certchain.pem
$ openssl verify -CAfile certchain.pem leaf.pem
leaf.pem: OK
Error messages from openssl indicate that a certificate in the chain did not validate, and that the chain is not valid. Error messages that indicate that the certificate chain is not valid shall be cause to fail this test.
Issuer field, there is a corresponding certificate whose Subject field matches that Issuer field.
A complete certificate chain starts with a leaf certificate and ends with a self-signed (CA root) certificate. Between the leaf certificate and the CA root certificate there should be one or more signer certificates. A leaf certificate is signed by a signer certificate, and the signer certificate is identified by its DnQualifier in the "Issuer" field of the leaf certificate. In a chain of three certificates, the signer certificate is in turn signed by the CA root certificate, which is similarly identified by its DnQualifier in the Issuer field of the signer's certificate. The CA root certificate is self-signed and has its own DnQualifier in both the Subject and Issuer fields.
To verify that the certificate chain is complete, confirm that the certificates corresponding to the Issuer DnQualifiers of each of the certificates are present, as explained in Section 2.1.11: Public Key Thumbprint. A certificate chain that does not contain all of the certificates matching the DnQualifiers specified in the Issuer fields of the certificates is not complete, and shall be cause to fail this test.
The validity period of a certificate can be viewed using the procedure described in Section 2.1.7: Validity Field . Confirm that for each certificate in the chain, the signer certificate's validity period completely contains the validity period of the signed certificate. A certificate that has a validity period that extends beyond the validity period of its signer (either starting before, or ending after, the validity period of its signer) shall be cause to fail this test.
To confirm that the CA root certificate is a valid root certificate:
A CA Root certificate that is not self-signed shall be cause to fail this test.
BasicConstraint field is True, the PathLenConstraint value is present and is either zero or positive. Verify that if the certificate authority attribute of the BasicConstraint field is False, the PathLenConstraint field is absent or set to zero.
OrganizationName in the subject and issuer fields do not match.
OrganizationName values in the Subject and Issuer fields.
Verify that the operation fails. A successful operation using a malformed certificate is cause to fail this test.
sha256WithRSAEncryption.
65537.
AuthorityKeyIdentifier X.509v3 extension.
AuthorityKeyIdentifier.
Verify that the operation fails. A successful operation using a certificate without the certificate signer present is cause to fail this test.
This chapter contains tests for Key Delivery Messages (KDM). The test procedures in this chapter are organized into three groups: tests that evaluate a KDM's compliance to [SMPTE-430-1] , tests that evaluate a KDM's compliance to [SMPTE-430-3] , and tests that evaluate the behavior of devices that decode KDMs. The KDM Decoder tests are in this section because they are not specific to any particular type of system. All d-cinema devices that decode KDMs must behave in the manner described by these tests.
Before diving into testing KDM files, we will first introduce XML and provide some examples of KDM documents.
XML is a file metaformat: a file format for creating file formats. Many of the files that comprise a d-cinema composition (e.g., a feature or trailer) are expressed in XML. While the various d-cinema file formats represent different concepts within the d-cinema system, the arrangement of data within the files is syntactically similar for those files that use XML. This section will provide an overview of XML as used for d-cinema applications. Readers looking for more detailed technical information are referred to the home of XML at http://www.w3.org .
The main unit of data storage in an XML document is the XML element. XML elements are expressed in a document using tags: strings of human-readable text enclosed between less-than (<) and greater-than (>) characters. An XML document is an element that is meant to be interpreted as a complete unit. Every XML document consists of a single XML element having zero or more (usually hundreds more) elements inside. XML documents may be stored as files, transmitted over networks, etc. The following example shows a very simple XML element, rendered as a single tag:
<Comment/>
By itself, this XML element is a complete, though very uninteresting XML document.
To be more useful, our example element needs some data, or content . XML content may include unstructured text or additional XML elements. Here we have expanded the element to contain some text:
<Comment>The quick brown fox...</Comment>
Notice that when an XML element has content, the content is surrounded by two tags, in this case <Comment> and </Comment>. The former is an opening tag, the latter a closing tag.
We now have some data inside our element. We could help the reader of our example XML document by indicating the language that the text represents (these same characters could of course form words from other languages). The language of the text is metadata : in this case, data about the text. In XML, metadata is stored as sets of key/value pairs, or attributes , inside the opening tags. We will add an attribute to our example element to show some metadata, in this case we are telling the reader that the text is in English:
<Comment language="en">The quick brown fox...</Comment>
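Reading an element, its content, and its metadata programmatically looks like this in Python, using the standard library's xml.etree.ElementTree module (a minimal illustration; this is not part of the test plan's tooling):

```python
import xml.etree.ElementTree as ET

# Parse the example element from a string.
elem = ET.fromstring('<Comment language="en">The quick brown fox...</Comment>')

print(elem.tag)              # the element name: Comment
print(elem.text)             # the content: The quick brown fox...
print(elem.get("language"))  # the attribute value: en
```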
The following example shows an actual d-cinema data structure (there is no need to understand the contents of this example, as this particular structure is covered in more detail in Section 4.2.1):
<?xml version="1.0" encoding="UTF-8" standalone="no" ?> <PackingList xmlns="http://www.smpte-ra.org/schemas/429-8/2007/PKL"> <Id>urn:uuid:59430cd7-882d-48e8-a026-aef4b6253dfc</Id> <AnnotationText>Perfect Movie DCP</AnnotationText> <IssueDate>2007-07-25T18:21:31-00:00</IssueDate> <Issuer>user@host</Issuer> <Creator>Packaging Tools v1.0</Creator> <AssetList> <Asset> <Id>urn:uuid:24d73510-3481-4ae5-b8a5-30d9eeced9c1</Id> <Hash>AXufMKY7NyZcfSXQ9sCZls5dSyE=</Hash> <Size>32239753</Size> <Type>application/mxf</Type> <AnnotationText>includes M&E</AnnotationText> </Asset> </AssetList> </PackingList>
You may have noticed that the basic structure of XML allows the expression of almost unlimited types and formats of information. Before a device (or a person) can read an XML document and decide whether it is semantically correct, it must be possible for the reader to know what the document is expected to contain.
The XML standard dictates some initial requirements for XML documents. The document shown in Example 3.1 above illustrates some of these requirements:
Element3 element (it should close before Element2 closes, not after).
<Element1> <Element2> <Element3> </Element2> </Element3> </Element1>
A document which meets these requirements is said to be well formed. All XML documents must be well formed. An XML parser (a program that reads XML syntax) will complain if you give it XML that is not well-formed. Well-formedness, however, does not help us understand semantically what's in an XML document. To know the meaning of a particular XML structure, we have to have a description of that structure.
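The well-formedness rules can be observed directly with any XML parser. The following sketch (illustrative only, using Python's standard library) accepts a properly nested document and rejects the improperly nested example shown above:

```python
import xml.etree.ElementTree as ET

def is_well_formed(doc: str) -> bool:
    """Return True if doc parses as well-formed XML, False otherwise."""
    try:
        ET.fromstring(doc)
        return True
    except ET.ParseError:
        return False

good = "<Element1><Element2><Element3/></Element2></Element1>"
bad = "<Element1><Element2><Element3></Element2></Element3></Element1>"

print(is_well_formed(good))  # True
print(is_well_formed(bad))   # False: Element3 closes after Element2
```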
The structure and permitted values in an XML document can be defined using XML Schema. There are other languages for expressing the content model of an XML document, but XML Schema is the standard used by the SMPTE specifications for d-cinema. XML Schema is a language, expressed in XML, which allows the user to define the names of the elements and attributes that can appear in an XML document. An XML Schema can also describe the acceptable contents of and combinations of the XML elements.
Given an XML Schema and an XML document, a validating XML parser will report not only errors in syntax but also errors in the use and contents of the elements defined by the schema. Throughout this document, we will use the schema-check program (see Section C.3 ) to test XML documents. The command takes the instance document and one or more schema documents as arguments:
$ schema-check <input-file> smpte-430-3.xsd
If this command returns without errors, the XML document can be said to be both well-formed and valid.
Some XML documents are defined using more than one schema. In these cases, you can supply the names of any number of schemas on the command line:
$ schema-check <input-file> smpte-430-3.xsd smpte-430-1.xsd
XML Signature is a standard for creating and verifying digital signatures on XML documents. Digital signatures are used to allow recipients of Composition Playlists, Packing Lists and Key Delivery Messages (KDM) to authenticate the documents; to prove that the documents were signed by the party identified in the document as the document's signer, and that the documents have not been modified or damaged since being signed.
The checksig program (distributed with the XML Security library) can be used to test the signature on an XML document. The program is executed with the name of a file containing a signed XML document:
$ checksig test-kdm.xml Signature verified OK!
The program expects that the first certificate in the <KeyInfo> element is the signer. This has two implications:
To address the first issue, the dsig_cert.py program (see Section C.8 ) can be used to re-write the XML document with the signer's certificate first in the <KeyInfo> element. This is demonstrated in the following example:
$ dsig_cert.py test-kdm.xml > tmp.xml
$ checksig tmp.xml
Signature verified OK!
The second issue is addressed by extracting the certificates from the document's XML Signature data and validating them directly with openssl . This procedure is the subject of the next section.
-----BEGIN CERTIFICATE----- followed by a newline. The encoded text is followed by the string -----END CERTIFICATE-----. An example of this format can be seen below. Note that the Printable Encoding has newlines after every 64 characters.
-----BEGIN CERTIFICATE-----
MIIEdzCCA1+gAwIBAgICNBowDQYJKoZIhvcNAQELBQAwgYQxGTAXBgNVBAoTEC5j
YS5jaW5lY2VydC5jb20xLDAqBgNVBAsTIy5yYS0xYi5yYS0xYS5zNDMwLTIuY2Eu
Y2luZWNlcnQuY29tMRIwEAYDVQQDEwkuY2MtYWRtaW4xJTAjBgNVBC4THGNwSmxw
NDBCM0hqSG9kOG9JWnpsVi9DU0xmND0wIBcNMDcwMTE1MjI0OTQ0WhgPMjAwODAx
MTUyMjQ5NDJaMIGLMRkwFwYDVQQKExAuY2EuY2luZWNlcnQuY29tMTUwMwYDVQQL
EywuY2MtYWRtaW4ucmEtMWIucmEtMWEuczQzMC0yLmNhLmNpbmVjZXJ0LmNvbTEQ
MA4GA1UEAxMHU00ud3MtMTElMCMGA1UELhMcdC8zQ2xNWjdiQWRGUnhnam1TRTFn
NGY4NUhNPTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAOBejWa3Lg+Y
uvTYhCaFy0ET6zH6XrB3rLRrlbeMrrTuUMCX0YSmA7m3ZO1Bd/HQrJxyq6hJmPGu
auxwWiF4w+AajBRp4eSiAt8srACcEmUyqGHwPLoaKVEaHXSOY8gJp1kZwqGwoR40
RQusfAb2/L76+RlMUyACoJuR6k4kOBW3bjEE4E76KKR4k5K580d7uFf5G86GhGfU
AfXHJXboqzHnxQHaMldKNaSskxWrW8GrX43+2ZZUHM2ZKe0Ps/9g2gCRZ6eYaim4
UF+szH0EUY0Mbx4poqn+SZFrUWtEoWcDM6PSTTgCQVOQ1BtzD1lBQoNQGOJcd73N
9f5MfGioWMkCAwEAAaOB5zCB5DALBgNVHQ8EBAMCBLAwDAYDVR0TAQH/BAIwADAd
BgNVHQ4EFgQUt/3ClMZ7bAdFRxgjmSE1g4f85HMwgacGA1UdIwSBnzCBnIAUcpJl
p40B3HjHod8oIZzlV/CSLf6hf6R9MHsxGTAXBgNVBAoTEC5jYS5jaW5lY2VydC5j
b20xJjAkBgNVBAsTHS5yYS0xYS5zNDMwLTIuY2EuY2luZWNlcnQuY29tMQ8wDQYD
VQQDEwYucmEtMWIxJTAjBgNVBC4THEJteVdZV3d0M29FNlJGSTVYdDd3K0hGaEtW
Zz2CAwDpzTANBgkqhkiG9w0BAQsFAAOCAQEAowjAFQsyoKto7+WBeF9HuCRpKkxk
6qMgXzgAfJFRk/pi7CjnfjxvWukJq4HWgWHpXsGFf/RTp08naV1UHNe71sDYV2Fb
MOSFRi2OrRwZExO9SBKQHLZ7ZdLU+6GIHXKjmp9DiofUNOqvZPQnvwG/CmO84CpG
K14ktxtOghczzEiJCk2KISsgOU6NK4cmcFfMjuklTwmD5C6TvaawkvcNJQcldjUw
TWbvd+Edf9wkHNvBERR9lbCGWr16C5BVQZtFBJAU++3guL/4Qn4lkeU/gmR6o99S
UQ+T344CBSIy06ztiWZiuxoONoXfy12DTSepB+QShmuhsScrfv0Q9bB5hw==
-----END CERTIFICATE-----
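The PEM layout is simple enough to generate directly. This hypothetical helper (not one of the test plan's tools) wraps Base64-encoded DER data in the armor shown above, with 64-character lines:

```python
import base64
import textwrap

def der_to_pem(der_bytes: bytes) -> str:
    """Encode DER certificate bytes as PEM with 64-character Base64 lines."""
    b64 = base64.b64encode(der_bytes).decode("ascii")
    body = "\n".join(textwrap.wrap(b64, 64))
    return f"-----BEGIN CERTIFICATE-----\n{body}\n-----END CERTIFICATE-----\n"

# Dummy bytes stand in for real DER data in this sketch.
pem = der_to_pem(b"\x30\x82" + bytes(200))
```

Files produced this way are accepted by the openssl x509 command shown elsewhere in this chapter.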
Within an XML document signed using XML Signature, certificates are stored in <dsig:X509Certificate> elements. These elements can be found at the end of the document, within the <dsig:Signature> element. The encoding method for storing certificate data in XML Signature is virtually identical to PEM. The Base64 encoding (see [RFC-2045] ) uses the same mapping of binary data to text characters, but the line length is not limited as with PEM.
It is a relatively easy task to use a Text Editor to copy and paste certificate data from an XML document:
-----BEGIN CERTIFICATE-----, then press the Enter key. Note that the number of '-' (dash) characters on either side of the BEGIN CERTIFICATE label is five (5).
<dsig:X509Certificate> element (but not the element tags) from the KDM and paste it into the new editor window. The cursor should now be positioned at the last character of the certificate; press the Enter key.
-----END CERTIFICATE----- at the end of the new editor window and press the Enter key.
.pem suffix.
In most cases the procedure given above can be automated using the dsig_extract.py program (see Section C.9 ). As shown below, the -p option can be used to provide a prefix for the automatically-generated filenames. In this example, the input document contained four certificates.
$ dsig_extract.py -p my_prefix_ test-kdm.xml
$ ls my_prefix_*
my_prefix_1.pem  my_prefix_2.pem  my_prefix_3.pem  my_prefix_4.pem
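The core of what dsig_extract.py does can be sketched with the Python standard library alone. This illustrative version (the function name and exact behavior here are assumptions; the real tool is described in Section C.9) collects each <dsig:X509Certificate> element's text and re-wraps it as PEM:

```python
import textwrap
import xml.etree.ElementTree as ET

DSIG = "{http://www.w3.org/2000/09/xmldsig#}"

def extract_pem_certs(xml_text: str) -> list:
    """Return a PEM string for each X509Certificate element in the document."""
    root = ET.fromstring(xml_text)
    pems = []
    for elem in root.iter(DSIG + "X509Certificate"):
        b64 = "".join(elem.text.split())  # drop embedded whitespace/newlines
        body = "\n".join(textwrap.wrap(b64, 64))
        pems.append(f"-----BEGIN CERTIFICATE-----\n{body}\n"
                    "-----END CERTIFICATE-----\n")
    return pems

doc = ('<a xmlns:ds="http://www.w3.org/2000/09/xmldsig#">'
       '<ds:X509Certificate> QUJD REVG </ds:X509Certificate></a>')
pems = extract_pem_certs(doc)
```

Each returned string can be written to a .pem file and inspected with the openssl x509 command shown below.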
You can test that the certificate has been correctly extracted by using openssl to view the contents of the certificate file:
$ openssl x509 -text -noout -in <certificate-file.pem>
The output from this command should look similar to Example 2.1.
To validate a complete chain of extracted certificates, use the procedure in Section 2.1.16 .
The Key Delivery Message (KDM) is an XML document that contains cryptographic information necessary to reproduce an encrypted composition. A KDM also contains metadata about the cryptographic information, such as the validity period and the associated Composition Playlist (CPL). The format of the KDM file is specified by [SMPTE-430-1] . A KDM is a type of Extra-Theater Message (ETM), as specified by [SMPTE-430-3] .
The following examples show the elements of the KDM that will be examined during the procedures. Each example is followed by a list of descriptive text that describes the various features of the KDM called out in the examples. These features will be referred to from the test procedures.
<?xml version="1.0" encoding="UTF-8" standalone="no"?>1 <DCinemaSecurityMessage xmlns="http://www.smpte-ra.org/schemas/430-3/2006/ETM"2 xmlns:dsig="http://www.w3.org/2000/09/xmldsig#" xmlns:enc="http://www.w3.org/2001/04/xmlenc#"> <AuthenticatedPublic Id="ID_AuthenticatedPublic">3 <MessageId>urn:uuid:b80e668c-a175-4bc7-ae48-d3a19c8fce95</MessageId>4 <MessageType>http://www.smpte-ra.org/430-1/2006/KDM#kdm-key-type</MessageType>5 <AnnotationText>Perfect Movie KDM</AnnotationText>6 <IssueDate>2007-07-24T17:42:58-00:00</IssueDate>7 <Signer>8 <dsig:X509IssuerName>dnQualifier=wBz3yptkPxbHI/\+LUUeH5R6rQfI=,CN=.cc-admin-x, OU=.cc-ra-1a.s430-2.ca.example.com,O=.ca.example.com</dsig:X509IssuerName> <dsig:X509SerialNumber>6992</dsig:X509SerialNumber> </Signer> <RequiredExtensions> <KDMRequiredExtensions xmlns="http://www.smpte-ra.org/schemas/430-1/2006/KDM"> <Recipient>9 <X509IssuerSerial> <dsig:X509IssuerName>dnQualifier=wBz3yptkPxbHI/\+LUUeH5R6rQfI=,CN=.cc-admin-x, OU=.cc-ra-1a.s430-2.ca.serverco.com,O=.ca.serverco.com</dsig:X509IssuerName> <dsig:X509SerialNumber>8992</dsig:X509SerialNumber>10 </X509IssuerSerial> <X509SubjectName>dnQualifier=83R40icxCejFRR6Ij6iwdf2faTY=,CN=SM.x_Mastering, OU=.cc-ra-1a.s430-2.ca.example.com,O=.ca.example.com</X509SubjectName>11 </Recipient> <CompositionPlaylistId>12 urn:uuid:20670ba3-d4c7-4539-ac3e-71e874d4d7d1 </CompositionPlaylistId> <ContentTitleText>Perfect Movie</ContentTitleText>13 <ContentKeysNotValidBefore>2007-07-24T17:42:54-00:00</ContentKeysNotValidBefore>14 <ContentKeysNotValidAfter>2007-08-23T17:42:54-00:00</ContentKeysNotValidAfter>15 <AuthorizedDeviceInfo> <DeviceListIdentifier>urn:uuid:d47713b9-cde1-40a9-98fe-22ef172723d0</DeviceListIdentifier> <DeviceList>16 <CertificateThumbprint>jk4Z8haFhqCGAVbClW65jVSOib4=</CertificateThumbprint>17 </DeviceList> </AuthorizedDeviceInfo> <KeyIdList>18 <TypedKeyId> <KeyType scope="http://www.smpte-ra.org/430-1/2006/KDM#kdm-key-type">MDIK</KeyType>19 
<KeyId>urn:uuid:15e929b3-1d86-40eb-875e-d21c916fdd3e</KeyId>20 </TypedKeyId> <TypedKeyId> <KeyType scope="http://www.smpte-ra.org/430-1/2006/KDM#kdm-key-type">MDAK</KeyType> <KeyId>urn:uuid:ca8f7756-8c92-4e84-a8e6-8fab898934f8</KeyId> </TypedKeyId> [remaining key IDs omitted for brevity] </KeyIdList> <ForensicMarkFlagList>21 <ForensicMarkFlag> http://www.smpte-ra.org/430-1/2006/KDM#mrkflg-audio-disable </ForensicMarkFlag> </ForensicMarkFlagList> </KDMRequiredExtensions> </RequiredExtensions> <NonCriticalExtensions/> </AuthenticatedPublic>
<AuthenticatedPrivate Id="ID_AuthenticatedPrivate">1 <enc:EncryptedKey xmlns:enc="http://www.w3.org/2001/04/xmlenc#">2 <enc:EncryptionMethod Algorithm="http://www.w3.org/2001/04/xmlenc#rsa-oaep-mgf1p">3 <ds:DigestMethod xmlns:ds="http://www.w3.org/2000/09/xmldsig#" Algorithm="http://www.w3.org/2000/09/xmldsig#sha1" /> </enc:EncryptionMethod> <enc:CipherData> <enc:CipherValue>4 [256 Byte long encrypted cipherdata block omitted] </enc:CipherValue> </enc:CipherData> </enc:EncryptedKey> <enc:EncryptedKey xmlns:enc="http://www.w3.org/2001/04/xmlenc#"> <enc:EncryptionMethod Algorithm="http://www.w3.org/2001/04/xmlenc#rsa-oaep-mgf1p"> <ds:DigestMethod xmlns:ds="http://www.w3.org/2000/09/xmldsig#" Algorithm="http://www.w3.org/2000/09/xmldsig#sha1" /> </enc:EncryptionMethod> <enc:CipherData> <enc:CipherValue> [256 Byte long encrypted cipherdata block omitted] </enc:CipherValue> </enc:CipherData> </enc:EncryptedKey> <enc:EncryptedKey xmlns:enc="http://www.w3.org/2001/04/xmlenc#"> <enc:EncryptionMethod Algorithm="http://www.w3.org/2001/04/xmlenc#rsa-oaep-mgf1p"> <ds:DigestMethod xmlns:ds="http://www.w3.org/2000/09/xmldsig#" Algorithm="http://www.w3.org/2000/09/xmldsig#sha1" /> </enc:EncryptionMethod> <enc:CipherData> <enc:CipherValue> [ 256 Byte long encrypted cipherdata block omitted] </enc:CipherValue> </enc:CipherData> </enc:EncryptedKey> <enc:EncryptedKey xmlns:enc="http://www.w3.org/2001/04/xmlenc#"> <enc:EncryptionMethod Algorithm="http://www.w3.org/2001/04/xmlenc#rsa-oaep-mgf1p"> <ds:DigestMethod xmlns:ds="http://www.w3.org/2000/09/xmldsig#" Algorithm="http://www.w3.org/2000/09/xmldsig#sha1" /> </enc:EncryptionMethod> <enc:CipherData> <enc:CipherValue> [ 256 Byte long encrypted cipherdata block omitted] </enc:CipherValue> </enc:CipherData> </enc:EncryptedKey> [additional EncryptionKey entries omitted] </AuthenticatedPrivate>
<dsig:Signature xmlns:dsig="http://www.w3.org/2000/09/xmldsig#">1 <dsig:SignedInfo> <dsig:CanonicalizationMethod Algorithm="http://www.w3.org/TR/2001/REC-xml-c14n-20010315#WithComments" />2 <dsig:SignatureMethod Algorithm="http://www.w3.org/2001/04/xmldsig-more#rsa-sha256" />3 <dsig:Reference URI="#ID_AuthenticatedPublic">4 <dsig:DigestMethod Algorithm="http://www.w3.org/2001/04/xmlenc#sha256" />5 <dsig:DigestValue>cnn8M41NR4jQF+9GOZiNJTlfl+C/l8lBFljuCuq9lQE=</dsig:DigestValue>6 </dsig:Reference> <dsig:Reference URI="#ID_AuthenticatedPrivate">7 <dsig:DigestMethod Algorithm="http://www.w3.org/2001/04/xmlenc#sha256" /> <dsig:DigestValue>TEW7tPwML2iOkIpK2/4rZbJbKgnnXjAtJwe9OJSe8u4=</dsig:DigestValue> </dsig:Reference> </dsig:SignedInfo> <dsig:SignatureValue>uH41s9odRPXzFz+BF3dJ/myG09cLSE9cLzf2C7f2Fm49P9C53T5RSeEIyqt6p5ll8 zlH2q3ZJRZcZuV5VA7UkIb4z6U4CGUTU51D8lL/anY1glLFddjUiDU/0nmC4uAsH rzwQgzOTZmZd2eLo0N70DBtNhTcJZftKUN2O2ybHZaJ7Q/aBxAiCK3h/fRW/b7zM bcbsD9/VfJFI7VQCOLYwTxq643Exj7sYGKISrjuN+MLAubG50hu74YLOtA/dmGB1 G4VeXkBBR/BEjOEeoxyfFpxbZwkdoI18/Qd1JF32xpE1PlTLrJoRyjrX/6qkm9OJ X9GyFNd8jVxdYNI4s1JCnQ==</dsig:SignatureValue> <dsig:KeyInfo>9 <dsig:X509Data> <dsig:X509IssuerSerial> <dsig:X509IssuerName>dnQualifier=wBz3yptkPxbHI/\+LUUeH5R6rQfI=, CN=.cc-admin-x,OU=.cc-ra-1a.s430-2.ca.example.com,O=.ca.example.com</dsig:X509IssuerName> <dsig:X509SerialNumber>6992</dsig:X509SerialNumber> </dsig:X509IssuerSerial> <dsig:X509Certificate>10 [PEM encoded certificate omitted] </dsig:X509Certificate> </dsig:X509Data> <dsig:X509Data> <dsig:X509IssuerSerial> <dsig:X509IssuerName>dnQualifier=8O8W8oYHlf97Y8n0kdAgMU7/jUU=, CN=.s430-2,OU=.ca.example.com,O=.ca.example.com</dsig:X509IssuerName> <dsig:X509SerialNumber>50966</dsig:X509SerialNumber> </dsig:X509IssuerSerial> <dsig:X509Certificate> [PEM encoded certificate omitted] </dsig:X509Certificate> </dsig:X509Data> <dsig:X509Data> <dsig:X509IssuerSerial> <dsig:X509IssuerName>dnQualifier=8O8W8oYHlf97Y8n0kdAgMU7/jUU=, 
CN=.s430-2,OU=.ca.example.com,O=.ca.example.com</dsig:X509IssuerName> <dsig:X509SerialNumber>13278513546878383468</dsig:X509SerialNumber> </dsig:X509IssuerSerial> <dsig:X509Certificate> [PEM encoded certificate omitted] </dsig:X509Certificate> </dsig:X509Data> </dsig:KeyInfo> </dsig:Signature></DCinemaSecurityMessage>
Since the KDM carries encrypted data, a tool that can decrypt the encrypted portions of the KDM has been provided in Section C.1 . kdm-decrypt takes two arguments, a KDM and the RSA private key that corresponds to the certificate to which the KDM was targeted, and displays the contents of the encrypted section. Here is an example of kdm-decrypt and the resulting output:
$ kdm-decrypt <kdm-file> <rsa-private-key.pem>
CipherDataID: f1dc124460169a0e85bc300642f866ab 1
SignerThumbprint: q5Oqr6GkfG6W2HzcBTee5m0Qjzw= 2
CPL Id: 119d8990-2e55-4114-80a2-e53f3403118d 3
Key Id: b6276c4b-b832-4984-aab6-250c9e4f9138 4
Key Type: MDIK 5
Not Before: 2007-09-20T03:24:53-00:00 6
Not After: 2007-10-20T03:24:53-00:00 7
Key Data: 7f2f711f1b4d44b83e1dd1bf90dc7d8c 8
$ schema-check <input-file> smpte-430-3.xsd
schema validation successful
If the KDM is not valid or well formed, the program will report an error. A reported error is cause to fail this test.
<IssueDate> element in the <AuthenticatedPublic> area of the KDM.
Validity section of the certificate as indicated by 6 in the example certificate.
<IssueDate> element as shown in 7 of Example 3.6.
Not Before and Not After values of the signer certificate to the date in the <IssueDate> element of the KDM and confirm that it is within the date range.
An <IssueDate> value outside the date ranges of the certificate is cause to fail this test.
<Signer> element of the KDM is valid.
Algorithm attribute of the <EncryptionMethod> for the encrypted key has the value "http://www.w3.org/2001/04/xmlenc#rsa-oaep-mgf1p".
Algorithm attribute of the <EncryptionMethod> element in the <AuthenticatedPrivate> element for each of the encrypted keys, as indicated by 3 in the example KDM, is "http://www.w3.org/2001/04/xmlenc#rsa-oaep-mgf1p". Any other value in this attribute is cause to fail this test.
<AnnotationText> element is in a human-readable language. If the optional xml:lang attribute is present, the language must match. If the xml:lang attribute is not present, the language must be English.
<AnnotationText> element as indicated by 6 in Example 3.6 is in a human-readable language. The presence of non-human-readable data, or text in a language other than English without that language's corresponding xml:lang value, is cause to fail this test.
<ReferenceList> element of the <EncryptedKey> element is not present.
<EncryptedKey> element, the <ReferenceList> element is not present. The presence of the <ReferenceList> element indicates that the KDM is malformed and is cause to fail this test.
Algorithm attribute of the <CanonicalizationMethod> element of the <SignedInfo> element in the <Signature> area of the KDM is "http://www.w3.org/TR/2001/REC-xml-c14n-20010315#WithComments".
Algorithm attribute of the <CanonicalizationMethod> of the <SignedInfo> element of the <Signature> element is "http://www.w3.org/TR/2001/REC-xml-c14n-20010315#WithComments", as shown in 2 of Example 3.8. Any other value in this attribute is cause to fail this test.
<SignedInfo> element of the <Signature> area of the KDM contains at least two child <Reference> elements. The value of the URI attribute of each <Reference> element must correspond to the respective ID attribute of the digested element. Verify that the URI attribute of one of the <Reference> elements identifies the AuthenticatedPublic portion of the KDM. Verify that the URI attribute of one of the <Reference> elements identifies the AuthenticatedPrivate portion of the KDM.
<SignedInfo> element of the <Signature> area of the KDM has at least two child <Reference> elements, as shown in 4 and 7 of Example 3.8. The presence of fewer than two <Reference> elements is cause to fail this test.
URI attribute of one of the <Reference> elements matches the value of the ID attribute of the AuthenticatedPublic element, as shown by 4 in Example 3.8 and 3 in Example 3.6. The absence of this association in the KDM is cause to fail this test.
URI attribute of one of the <Reference> elements matches the value of the ID attribute of the AuthenticatedPrivate element, as shown by 7 in Example 3.8 and 1 in Example 3.7. The absence of this association in the KDM is cause to fail this test.
<SignatureMethod> element of the <SignedInfo> element of the <Signature> area of the KDM contains the URI value "http://www.w3.org/2001/04/xmldsig-more#rsa-sha256".
<SignatureMethod> element of the <SignedInfo> element of the <Signature> section of the KDM contains the URI value "http://www.w3.org/2001/04/xmldsig-more#rsa-sha256", as shown in 3 of Example 3.8. Any other value is cause to fail this test.
<Reference> elements of the <SignedInfo> element in the <Signature> section of the KDM do not contain a Transforms attribute.
<Reference> elements of the <SignedInfo> element in the <Signature> section of the KDM do not contain a Transforms attribute. The presence of the Transforms attribute is cause to fail this test.
Algorithm attribute of the <DigestMethod> element of each of the <Reference> elements in the <SignedInfo> element of the <Signature> section of the KDM is "http://www.w3.org/2001/04/xmlenc#sha256".
Algorithm attribute of the <DigestMethod> element of each of the <Reference> elements is "http://www.w3.org/2001/04/xmlenc#sha256", as shown in 5 of Example 3.8. Any other value is cause to fail this test.
<Signature> element is properly encoded, all digests are properly formed, the <SignatureMethod> and <CanonicalizationMethod> in the <SignedInfo> element are correct, and the <Reference> values are correct. Verify that the signature is valid.
$ schema-check <input-file> smpte-430-3.xsd
schema validation successful
If the KDM is not valid or well formed, the program will report an error. A reported error is cause to fail this test.
<MessageType> element of the KDM contains the string "http://www.smpte-ra.org/430-1/2006/KDM#kdm-key-type".
<MessageType> element of the KDM contains the string "http://www.smpte-ra.org/430-1/2006/KDM#kdm-key-type" as shown in 5 of Example 3.6. Any other value in this element is cause to fail this test.
<SubjectName> element of the <Recipient> element of the <KDMRequiredExtensions> element in the KDM.
<SubjectName> of the <Recipient> element as shown in 11.
<ContentAuthenticator> element of the <KDMRequiredExtensions> element of the KDM contains one of the certificate thumbprints of one of the certificates in the chain of the signer of the CPL.
<ContentAuthenticator> element of the <KDMRequiredExtensions> element of the KDM. If the element is not present, this test is considered passed and the remaining procedure steps are not performed.
$ dc-thumbprint <certificate.pem>
<ContentAuthenticator> value matches one of the thumbprints of the certificate chain of the signer certificate. A <ContentAuthenticator> with a value that does not match one of the thumbprints is cause to fail this test.
<X509Data> elements of the <KeyInfo> elements in the signature portion of the KDM.
<X509Data> element can be achieved by validating the signature. If the validation is successful, then the certificate that signed the KDM is present. The signature can be validated using the dsig_cert.py and checksig commands. Example:
$ dsig_cert.py <kdm-file.kdm.xml> > tmp.xml
$ checksig tmp.xml
A KDM that causes checksig to display errors indicates that the signature did not validate and shall be cause to fail this test.
<TypedKeyId> element of the <KeyIdList> element in the <KDMRequiredExtensions> element is well formed. Verify that the element contains one of the following values: MDIK, MDAK, MDSK, FMIK, or FMAK.
$ schema-check <kdm-file.kdm.xml> smpte-430-1.xsd
schema validation successful
If the KDM is not valid or well formed, the program will report an error. A reported error is cause to fail this test.
<TypedKeyId> element, and verify that the element contains one of: MDIK, MDAK, MDSK, FMIK, or FMAK, as shown in 19 of Example 3.6. Any other value in this element is cause to fail this test.
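A membership check of this kind is easy to automate. The sketch below is an illustration only, under the assumption (per Example 3.6) that key types appear in <KeyType> elements of the SMPTE 430-1 KDM namespace:

```python
import xml.etree.ElementTree as ET

KDM_NS = "{http://www.smpte-ra.org/schemas/430-1/2006/KDM}"
VALID_KEY_TYPES = {"MDIK", "MDAK", "MDSK", "FMIK", "FMAK"}

def bad_key_types(kdm_text: str) -> list:
    """Return the text of any <KeyType> element outside the allowed set."""
    root = ET.fromstring(kdm_text)
    return [kt.text for kt in root.iter(KDM_NS + "KeyType")
            if kt.text not in VALID_KEY_TYPES]

doc = ('<r xmlns="http://www.smpte-ra.org/schemas/430-1/2006/KDM">'
       '<KeyType>MDIK</KeyType><KeyType>XXXX</KeyType></r>')
print(bad_key_types(doc))  # ['XXXX']
```

An empty result means every key type in the KDM is one of the five allowed values.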
<ForensicMarkFlagList> element contains a list of one or both of the following two URIs:
http://www.smpte-ra.org/430-1/2006/KDM#mrkflg-picture-disable
http://www.smpte-ra.org/430-1/2006/KDM#mrkflg-audio-disable
<ForensicMarkFlagList> element. The absence of the element is cause to pass this test and the remainder of this procedure can be skipped. If present, the element must contain one or both of the following URI values:
http://www.smpte-ra.org/430-1/2006/KDM#mrkflg-picture-disable
http://www.smpte-ra.org/430-1/2006/KDM#mrkflg-audio-disable
<EncryptedData> is not present.
<EncryptedData> element is not present. The presence of the element is cause to fail this test.
<KeyInfo> element of all <EncryptedKey> elements in the <AuthenticatedPrivate> section of the KDM are identical.
<KeyInfo> values are identical in all instances of <EncryptedKey> elements. The absence of <KeyInfo> elements is cause to pass this test. The presence of differing <KeyInfo> values in <EncryptedKey> elements is cause to fail this test.
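One way to mechanize this comparison is to serialize each <KeyInfo> child of every <EncryptedKey> element and check that all serializations are identical. This is a sketch only: a production test should compare canonicalized XML rather than raw re-serialization, which is sensitive to insignificant whitespace.

```python
import xml.etree.ElementTree as ET

ENC = "{http://www.w3.org/2001/04/xmlenc#}"
DSIG = "{http://www.w3.org/2000/09/xmldsig#}"

def keyinfo_identical(kdm_text: str) -> bool:
    """True if all EncryptedKey/KeyInfo elements serialize identically,
    or if no KeyInfo elements are present at all."""
    root = ET.fromstring(kdm_text)
    blobs = [ET.tostring(ki)
             for ek in root.iter(ENC + "EncryptedKey")
             for ki in ek.findall(DSIG + "KeyInfo")]
    return len(set(blobs)) <= 1
```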
<DeviceListDescription> element is in a human-readable language. If the optional xml:lang attribute is present, the language must match. If the xml:lang attribute is not present, the language must be English.
Using a Text Editor, view the KDM and confirm that the <DeviceListDescription> element is either absent or is present and contains human-readable text. The presence of non-human-readable data, or text in a language other than English without that language's corresponding xml:lang value, is cause to fail this test.
<ContentTitleText> element is in a human-readable language. If the optional xml:lang attribute is present, the language must match. If the xml:lang attribute is not present, the language must be English.
<ContentTitleText> element as indicated by 13 in Example 3.6 is in a human-readable language. The presence of non-human-readable data, or text in a language other than English without that language's corresponding xml:lang value, is cause to fail this test.
scope attribute of the <TypedKeyId> element of the <KeyIdList> element is absent or contains the value http://www.smpte-ra.org/430-1/2006/KDM#kdm-key-type.
<TypedKeyId> element is either not present or is present and contains the value http://www.smpte-ra.org/430-1/2006/KDM#kdm-key-type, as shown in 19 of Example 3.6. Presence of the scope attribute with any other value is cause to fail this test.
Algorithm attribute of the <EncryptionMethod> element of the <EncryptedKey/> element has the value "http://www.w3.org/2001/04/xmlenc#rsa-oaep-mgf1p".
Algorithm attribute of the <EncryptionMethod> of the <EncryptedKey/> element contains the value http://www.w3.org/2001/04/xmlenc#rsa-oaep-mgf1p, as shown in 3 of Example 3.7. Presence of the Algorithm attribute with any other value is cause to fail this test.
<CompositionPlaylistId> element in the KDM matches the value in the RSA protected <EncryptedKey> structure, and that these values match the value of the <Id> element in the respective composition playlist.
$ kdm-decrypt <kdm-file> <rsa-private-key.pem>
Verify that the <CompositionPlaylistId> element of the <KDMRequiredExtensions> element in the plaintext portion of the KDM contains the same value as the CPL ID present in the RSA protected <EncryptedKey> structure. Non-identical values shall be cause to fail this test.
<ContentKeysNotValidBefore> and <ContentKeysNotValidAfter> elements match their counterparts in the RSA protected <EncryptedKey> structure and that the values are in UTC format.
$ kdm-decrypt <kdm-file> <rsa-private-key.pem>
Verify that the <ContentKeysNotValidBefore> element of the <KDMRequiredExtensions> element has the same value as the corresponding field inside the RSA protected EncryptedKey structure, and that it is in UTC format as specified in [RFC-3339]. Non-identical values shall be cause to fail this test.
Verify that the <ContentKeysNotValidAfter> element of the <KDMRequiredExtensions> element has the same value as the corresponding field inside the RSA protected EncryptedKey structure, and that it is in UTC format as specified in [RFC-3339]. Non-identical values shall be cause to fail this test.
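The two conditions being checked here — each value parses as an [RFC-3339] date-time, and the plaintext and RSA-protected copies are identical — can be sketched as a short script. This is an illustration only; kdm-decrypt remains the procedure's tool, and the function names below are invented for the example.

```python
import re
from datetime import datetime

# RFC-3339 date-time, e.g. "2007-07-25T18:21:31+00:00" or a trailing "Z".
RFC3339 = re.compile(
    r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?(Z|[+-]\d{2}:\d{2})$"
)

def is_rfc3339(value: str) -> bool:
    if not RFC3339.match(value):
        return False
    # Double-check it is a real calendar date/time, not e.g. month 13.
    try:
        datetime.fromisoformat(value.replace("Z", "+00:00"))
        return True
    except ValueError:
        return False

def check_validity_pair(public_value: str, encrypted_value: str) -> bool:
    """Pass only when both values are RFC-3339 and identical."""
    return (is_rfc3339(public_value)
            and is_rfc3339(encrypted_value)
            and public_value == encrypted_value)

print(check_validity_pair("2007-07-25T18:21:31+00:00",
                          "2007-07-25T18:21:31+00:00"))  # True
print(check_validity_pair("2007-07-25T18:21:31+00:00",
                          "2007-07-26T18:21:31+00:00"))  # False
```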
<KeyIdList> element of the <KDMRequiredExtensions> element matches a KeyID in the RSA protected <EncryptedKey> structure, and that there are no KeyIDs without corresponding <EncryptedKey> structures, nor <EncryptedKey> structures with KeyIDs that are not present in the KeyIDList.
$ kdm-decrypt <kdm-file> <rsa-private-key.pem>
Compare the list of KeyIDs to the KeyIDs in the RSA protected EncryptedKey structures and verify that each of the KeyIDs in the list corresponds to a KeyID in an RSA protected EncryptedKey structure. The presence of KeyIDs in the KeyIDList that do not correspond to a KeyID in an RSA protected EncryptedKey structure shall be cause to fail this test. The presence of a KeyID in an RSA protected EncryptedKey structure that is not also present in the KeyIDList shall be cause to fail this test.
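This cross-check reduces to a set comparison: the test passes only when the two sets of KeyIDs are equal. A minimal sketch, using hypothetical placeholder KeyIDs:

```python
def compare_keyid_lists(keyidlist_ids, encrypted_key_ids):
    """Return (missing_structures, extra_structures): KeyIDs listed in the
    KeyIDList with no matching EncryptedKey structure, and KeyIDs found in
    EncryptedKey structures but absent from the KeyIDList. Both sets must
    be empty for the test to pass."""
    listed = set(keyidlist_ids)
    found = set(encrypted_key_ids)
    return listed - found, found - listed

# Hypothetical KeyID values, for demonstration only.
listed = ["urn:uuid:key-1", "urn:uuid:key-2"]
found = ["urn:uuid:key-2", "urn:uuid:key-3"]
missing, extra = compare_keyid_lists(listed, found)
print(sorted(missing))  # ['urn:uuid:key-1']
print(sorted(extra))    # ['urn:uuid:key-3']
```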
<EncryptedKey> structure is f1dc124460169a0e85bc300642f866ab.
$ kdm-decrypt <kdm-file> <rsa-private-key.pem>
Verify that the plaintext value of the CipherData Structure ID is f1dc124460169a0e85bc300642f866ab. Any other value shall be cause to fail this test.
<EncryptedKey> element matches the thumbprint of the certificate that signed the KDM.
$ kdm-decrypt <kdm-file> <rsa-private-key.pem>
A certificate thumbprint can be calculated using the dc-thumbprint tool included in Section C.1. Calculate the thumbprint with dc-thumbprint, i.e.,
$ dc-thumbprint <certificate.pem>
Identify the certificate used to sign the KDM and calculate its thumbprint. Compare this thumbprint against the thumbprint decrypted from the <EncryptedKey> element and confirm that they are the same. Non-identical values shall be cause to fail this test.
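The textual form of a thumbprint is a Base64-encoded SHA-1 digest, so the final comparison is a string equality check. The sketch below shows only that comparison step; exactly which DER structure is digested is defined by [SMPTE-430-2], and the dc-thumbprint tool in Section C.1 remains the way to compute the real value. The `demo` bytes are a stand-in, not certificate data.

```python
import base64
import hashlib

def b64_sha1(der_bytes: bytes) -> str:
    """Base64-encoded SHA-1 digest -- the textual form of a thumbprint."""
    return base64.b64encode(hashlib.sha1(der_bytes).digest()).decode("ascii")

def thumbprints_match(signer_der: bytes, decrypted_thumbprint: str) -> bool:
    """Compare a computed thumbprint with the one recovered by kdm-decrypt."""
    return b64_sha1(signer_der) == decrypted_thumbprint

# Self-contained demonstration with stand-in bytes:
demo = b"test"
print(b64_sha1(demo))  # qUqP5cyxm6YcTAhz05Hph5gvu9M=
```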
$ kdm-decrypt <kdm-file> <rsa-private-key.pem>
Verify that the plaintext representation of the <EncryptedKey> element contains two validity time stamps in UTC format. Time stamps that are not present or that are not in UTC format shall be cause to fail this test.
<CompositionPlaylistId> element in the other portions of the KDM.
$ kdm-decrypt <kdm-file> <rsa-private-key.pem>
Verify that the decrypted plaintext value of the CompositionPlaylistID is the same as the <CompositionPlaylistId> element in the AuthenticatedPublic area of the KDM. Mismatching composition playlist IDs shall be cause to fail this test.
<EncryptedKey> elements of the KDM use only the allowed key types (MDIK, MDAK, MDSK, FMIK and FMAK), and that they match the plaintext fields in the <TypedKeyId> element values for the KeyIDs in the <KeyIdList> element.
$ kdm-decrypt <kdm-file> <rsa-private-key.pem>
For each <EncryptedKey> element, verify that the plaintext representation contains a key type that is one of MDIK, MDAK, MDSK, FMIK or FMAK, and that the key type is identical to the key type for the corresponding KeyID in the KeyIDList. A key type that is not one of MDIK, MDAK, MDSK, FMIK or FMAK shall be cause to fail this test. A key type in the <EncryptedKey> element that does not match the key type for the corresponding KeyID in the KeyIDList shall be cause to fail this test.
<X509IssuerName> element is compliant with [RFC-2253].
<X509IssuerName> element as shown in 8 of Example 3.6.
Verify that any special characters are properly escaped, and the sequence is correct and valid. Improperly escaped characters or sequences that do not conform to [RFC-2253] shall be cause to fail this test.
The procedures in this section test the behavior of a KDM decoding device, such as a Security Manager (SM) or a KDM authoring device. The procedures use a generic syntax to instruct the test operator to cause the Test Subject to decode a KDM.
In the case of an SM, the text "Perform an operation..." should be interpreted to mean "Assemble and play a show with DCI 2K StEM (Encrypted) ...".
In the case of a KDM authoring device, the text "Perform an operation..." should be interpreted to mean "Perform a KDM read or ingest operation...".
Some of the procedures in this section require test content that is specifically malformed. In some implementations, these malformations may be caught and reported directly by the SMS without involving the SM. Because the purpose of the procedures is to assure that the SM demonstrates the required behavior, the manufacturer of the Test Subject may need to provide special test programs or special SMS testing modes to allow the malformed content to be applied directly to the SM.
<NonCriticalExtensions> element is present and not empty.
<NonCriticalExtensions> element with child content. Verify that the operation is successful. A failed operation shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
The section "Maximum Number of DCP Keys" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
The section "Certificate Presence Check" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
KeyInfo elements are present in the <EncryptedKey> elements of the <AuthenticatedPrivate> area of the KDM, the Test Subject verifies that they all match, and that the Test Subject rejects the KDM if they do not match.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
<MessageType> value. If the operation succeeds, this is cause to fail this test.
KDMKeysReceived events associated with the above steps and:
SignerID parameter contains the Certificate Thumbprint of the signing certificate of KDM with invalid MessageType. Verify that ReferencedIDs element contains a KeyDeliveryMessageID parameter with a value that is the MessageId of KDM with invalid MessageType. Failure of any verification shall be cause to fail this test.
contentId element contains the Id of DCI 2K StEM (Encrypted). Verify that the value of the SignerID parameter contains the Certificate Thumbprint of the signing certificate of KDM with expired Signer certificate. Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of DCI 2K StEM (Encrypted) and KeyDeliveryMessageID parameter with a value that is the MessageId of KDM with expired Signer certificate. Failure of any verification shall be cause to fail this test.
KDMFormatError exception in each KDMKeysReceived log record. Record any additional parameters associated with the exception. A missing KDMFormatError exception in any of the associated KDMKeysReceived log records shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
KDMKeysReceived events associated with the above steps and:
contentId element contains the Id of DCI 2K StEM (Encrypted). Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of DCI 2K StEM (Encrypted) and KeyDeliveryMessageID parameter with a value that is the MessageId of the KDM used. Missing required elements or incorrect parameters shall be cause to fail this test.
SignerId parameter contains the Certificate Thumbprint of the signing certificate of the KDM.
SignatureError exception in each KDMKeysReceived log record. Record any additional parameters associated with the exception. A missing SignatureError exception in any of the associated KDMKeysReceived log records shall be cause to fail this test.
KDMKeysReceived event associated with the above step and:
contentId element contains the Id of DCI 2K StEM (Encrypted). Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of DCI 2K StEM (Encrypted) and KeyDeliveryMessageID parameter with a value that is the MessageId of the KDM used. Missing required elements or incorrect parameters shall be cause to fail this test.
CertFormatError exception in the KDMKeysReceived log record. Record any additional parameters associated with the exception. A missing CertFormatError exception in the associated KDMKeysReceived log record shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
<NonCriticalExtensions> element is present and not empty.
<NonCriticalExtensions> element with child content. Verify that the operation is successful. A failed operation shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
KeyInfo elements are present in the <EncryptedKey> elements of the <AuthenticatedPrivate> area of the KDM, the OBAE-capable Test Subject verifies that they all match, and that the OBAE-capable Test Subject rejects the KDM if they do not match.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
<MessageType> value. If the operation succeeds, this is cause to fail this test.
KDMKeysReceived events associated with the above steps and:
SignerID parameter contains the Certificate Thumbprint of the signing certificate of KDM with invalid MessageType (OBAE). Verify that ReferencedIDs element contains a KeyDeliveryMessageID parameter with a value that is the MessageId of KDM with invalid MessageType (OBAE). Failure of any verification shall be cause to fail this test.
contentId element contains the Id of DCI 2K StEM (OBAE) (Encrypted). Verify that the value of the SignerID parameter contains the Certificate Thumbprint of the signing certificate of KDM with expired Signer certificate (OBAE). Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of DCI 2K StEM (OBAE) (Encrypted) and KeyDeliveryMessageID parameter with a value that is the MessageId of KDM with expired Signer certificate (OBAE). Failure of any verification shall be cause to fail this test.
KDMFormatError exception in each KDMKeysReceived log record. Record any additional parameters associated with the exception. A missing KDMFormatError exception in any of the associated KDMKeysReceived log records shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
KDMKeysReceived events associated with the above steps and:
contentId element contains the Id of DCI 2K StEM (OBAE) (Encrypted). Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of DCI 2K StEM (OBAE) (Encrypted) and KeyDeliveryMessageID parameter with a value that is the MessageId of the KDM used. Missing required elements or incorrect parameters shall be cause to fail this test.
SignerId parameter contains the Certificate Thumbprint of the signing certificate of the KDM.
SignatureError exception in each KDMKeysReceived log record. Record any additional parameters associated with the exception. A missing SignatureError exception in any of the associated KDMKeysReceived log records shall be cause to fail this test.
KDMKeysReceived event associated with the above step and:
contentId element contains the Id of DCI 2K StEM (OBAE) (Encrypted). Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of DCI 2K StEM (OBAE) (Encrypted) and KeyDeliveryMessageID parameter with a value that is the MessageId of the KDM used. Missing required elements or incorrect parameters shall be cause to fail this test.
CertFormatError exception in the KDMKeysReceived log record. Record any additional parameters associated with the exception. A missing CertFormatError exception in the associated KDMKeysReceived log record shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
The DCP is the file format for d-cinema content. Entire suites of standards documents from SMPTE define this format, most notably the 428 and 429 multi-part documents. In addition, many IETF documents and some ISO documents are referenced from the SMPTE works. Reading and understanding all of these documents is a substantial task, but it is essential knowledge for accurate and efficient analysis of d-cinema files.
In the following procedures, simple tools are used to display the contents of d-cinema files. Example output from these tools is shown with descriptions of the features that will be interesting to the Test Operator. In addition to the tools used in this text, the Test Operator may use more sophisticated methods so long as the results obtained are equivalent to the procedures presented here. The reader should also note that a programmer's Text Editor and a binary viewer or editor are essential tools for direct inspection of data.
D-cinema track files and composition playlists are identified by unique, embedded identifiers. These identifiers, called UUIDs, are defined by [RFC-4122]. D-cinema XML files use UUIDs to refer to other d-cinema XML files and MXF files (assets). When d-cinema assets are written to a filesystem, a mechanism is needed to relate the UUID values to filename values in the filesystem. An Asset Map is an XML document that provides a mapping from UUID values to filesystem paths. When a d-cinema package is written to a volume, an Asset Map is created that includes the size and location of every file in the package.
Along with the Asset Map, each volume has a Volume Index file. The Volume Index file is used to differentiate volumes in a multiple-volume distribution. Both Asset Maps and Volume Indexes are XML files (as described in Section 3.1). The formats of the Asset Map file and the Volume Index file are specified in [SMPTE-429-9].
<?xml version="1.0" encoding="UTF-8"?>1 <AssetMap xmlns="http://www.smpte-ra.org/schemas/429-9/2007/AM">2 <Id>urn:uuid:425e93f7-bca2-4255-b8ec-8c7d16fc8881</Id>3 <Creator> Packaging Tools v1.0 </Creator>4 <VolumeCount>1</VolumeCount>5 <IssueDate>2007-07-06T18:25:42-00:00</IssueDate>6 <Issuer>user@host</Issuer>7 <AssetList>8 <Asset>9 <Id>urn:uuid:034b95b0-7424-420f-bbff-a875a79465a5</Id>10 <PackingList>true</PackingList>11 <ChunkList>12 <Chunk>13 <Path>perfect_movie_domestic_51.pkl.xml</Path>14 <VolumeIndex>1</VolumeIndex>15 <Offset>0</Offset>16 <Length>14366</Length>17 </Chunk> </ChunkList> </Asset> <Asset> <Id>urn:uuid:4f89a209-919b-4f21-a1d6-21ad32581115</Id> <ChunkList> <Chunk> <Path>perfect_movie_j2c_r01.mxf</Path> <VolumeIndex>1</VolumeIndex> <Offset>0</Offset> <Length>342162304</Length> </Chunk> </ChunkList> </Asset> <Asset> <Id>urn:uuid:e522f7b6-6731-4df5-a80e-8cfd74f82219</Id> <ChunkList> <Chunk> <Path>perfect_movie_wav_r01.mxf</Path> <VolumeIndex>1</VolumeIndex> <Offset>0</Offset> <Length>34591246</Length> </Chunk> </ChunkList> </Asset> [additional assets omitted for brevity] ... </AssetList> </AssetMap>
<?xml version="1.0" encoding="UTF-8"?>1 <VolumeIndex xmlns="http://www.smpte-ra.org/schemas/429-9/2007/AM">2 <Index>1</Index>3 </VolumeIndex>
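The UUID-to-path mapping that an Asset Map provides can be sketched with Python's standard XML library. The fragment below is a pared-down Asset Map based on the example above; a real document would carry many assets and chunk details.

```python
import xml.etree.ElementTree as ET

# Namespace from [SMPTE-429-9], as used in the Asset Map example above.
AM_NS = "{http://www.smpte-ra.org/schemas/429-9/2007/AM}"

def asset_paths(assetmap_xml: str) -> dict:
    """Map each asset UUID to its filesystem path, from an Asset Map document.
    Assumes one <Chunk> per asset, the common single-volume case."""
    root = ET.fromstring(assetmap_xml)
    mapping = {}
    for asset in root.iter(AM_NS + "Asset"):
        uuid = asset.findtext(AM_NS + "Id")
        path = asset.findtext(f"{AM_NS}ChunkList/{AM_NS}Chunk/{AM_NS}Path")
        mapping[uuid] = path
    return mapping

doc = """<AssetMap xmlns="http://www.smpte-ra.org/schemas/429-9/2007/AM">
  <AssetList>
    <Asset>
      <Id>urn:uuid:034b95b0-7424-420f-bbff-a875a79465a5</Id>
      <ChunkList><Chunk>
        <Path>perfect_movie_domestic_51.pkl.xml</Path>
      </Chunk></ChunkList>
    </Asset>
  </AssetList>
</AssetMap>"""

print(asset_paths(doc))
# {'urn:uuid:034b95b0-7424-420f-bbff-a875a79465a5': 'perfect_movie_domestic_51.pkl.xml'}
```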
ASSETMAP.xml. Verify that the Asset Map validates against the schema defined in [SMPTE-429-9].

ASSETMAP.xml is cause to fail this test.
ASSETMAP.xml against the schema in [SMPTE-429-9]. Failure to correctly validate is cause to fail this test. For more information on schema validation, see Section 1.4: Conventions and Practices. E.g.:
$ cd / $ ls -F ASSETMAP.xml PKL_c2434860-7dab-da2b-c39f-5df000eb2335.xml J2K_a13c59ec-f720-1d1f-b78f-9bdea4968c7d_video.mxf WAV_22d190bd-f43b-a420-a12e-2bf29a737521_audio.mxf ... $ $ schema-check ASSETMAP.xml smpte-429-9.xsd schema validation successful $
VOLINDEX.xml. Verify that the Volume Index file validates against the schema defined in [SMPTE-429-9].

VOLINDEX.xml is cause to fail this test.
VOLINDEX.xml against the schema in [SMPTE-429-9]. Failure to correctly validate is cause to fail this test. For more information on schema validation, see Section 1.4: Conventions and Practices.
$ cd / $ ls -F VOLINDEX.xml PKL_c2434860-7dab-da2b-c39f-5df000eb2335.xml J2K_a13c59ec-f720-1d1f-b78f-9bdea4968c7d_video.mxf WAV_22d190bd-f43b-a420-a12e-2bf29a737521_audio.mxf ... $ $ schema-check VOLINDEX.xml smpte-429-9.xsd schema validation successful $
The Packing List (PKL) is an XML document (see Section 3.1 ) that specifies the contents of a d-cinema Package. It contains the UUID, file type (MXF track file, CPL, etc.), and a message digest of each file in the DCP. This information is used to ensure that all of the expected files have been included and have not been modified or corrupted in transit. The format of the Packing List file is specified by [SMPTE-429-8] .
<?xml version="1.0" encoding="UTF-8" standalone="no"?>1 <PackingList xmlns="http://www.smpte-ra.org/schemas/429-8/2007/PKL">2 <Id>urn:uuid:59430cd7-882d-48e8-a026-aef4b6253dfc</Id>3 <AnnotationText>Perfect Movie DCP</AnnotationText>4 <IssueDate>2007-07-25T18:21:31-00:00</IssueDate>5 <Issuer>user@host</Issuer>6 <Creator>Packaging Tools v1.0</Creator>7 <AssetList>8 <Asset>9 <Id>urn:uuid:24d73510-3481-4ae5-b8a5-30d9eeced9c1</Id>10 <Hash>AXufMKY7NyZcfSXQ9sCZls5dSyE=</Hash>11 <Size>32239753</Size>12 <Type>application/mxf</Type>13 </Asset> <Asset> <Id>urn:uuid:456e547d-af92-4abc-baf3-c4d730bbcd65</Id> <Hash>kAAo0kXYVDBJUphIID89zauv50w=</Hash> <Size>86474446</Size> <Type>application/mxf</Type> </Asset> <Asset> <Id>urn:uuid:e4a4e438-63ec-46cb-b9aa-43acee787d79</Id> <Hash>kt5bP8y4zmHNAY1qVnujItAb4sY=</Hash> <Size>12163</Size> <Type>text/xml</Type> </Asset> <Asset> <Id>urn:uuid:3d445456-54d5-42bc-a7cc-a8c00b20ffb7</Id> <Hash>AQWMKCxxMv001zTS3Y3Oj8M+d9s=</Hash> <Size>62500144</Size> <Type>application/mxf</Type> </Asset> [Remaining assets and signature omitted for brevity] </AssetList> [Signature omitted for brevity] </PackingList>
language attribute of the <AnnotationText> element is not present, or is present with a value of "en", that the Annotation text is in human-readable English.
$ schema_check.py <input-file> smpte-429-8.xsd schema validation successful $
<AnnotationText> 4 element is not present, or is present with a value of "en", that the contents of the <AnnotationText> 4 element are human-readable English. Failure to meet this requirement is cause to fail this test.
$ vi <input-file> ... <AnnotationText>Perfect Movie Reel #1 Picture</AnnotationText> ... <AnnotationText language="en">Perfect Movie Reel #1 Sound</AnnotationText> ... :q $
$ uuid_check.py <input-file> all UUIDs conform to RFC-4122 $
ASSETMAP.xml file is available; otherwise the tester will need to devise a method for locating the relevant assets. For each of the <Asset> 9 elements contained in the Packing List, compare the contents of the child <Id> 10 element with the contents of the ASSETMAP.xml file to discover the path to the asset. List the file size of the referenced asset and verify that it is identical to the value of the child <Size> 12 element inside the <Asset> 9 element. One or more failures to verify the file sizes is cause to fail this test.
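The size comparison in this procedure is a simple loop over (path, expected-size) pairs. A minimal sketch, demonstrated against a temporary stand-in file rather than a real asset:

```python
import os
import tempfile

def verify_sizes(size_by_path):
    """Compare each asset's <Size> value against the file's actual size on
    disk; return the list of paths that fail the check."""
    return [path for path, expected in size_by_path.items()
            if os.path.getsize(path) != expected]

# Demonstration with a temporary stand-in asset:
with tempfile.TemporaryDirectory() as d:
    asset = os.path.join(d, "reel1.mxf")
    with open(asset, "wb") as f:
        f.write(b"\x00" * 1024)
    print(verify_sizes({asset: 1024}))  # [] -- size matches, test passes
    print(verify_sizes({asset: 2048}))  # [asset path] -- mismatch fails the test
```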
$ dsig_cert.py <pkl-file.pkl.xml> > tmp.xml $ checksig tmp.xml The supplied signature is valid $
The Composition Playlist (CPL) is an XML document (see Section 3.1 ) that contains the information necessary to reproduce a composition. It contains metadata about the composition such as the title and the rating, and references to the track files that contain the composition's essence. The format of the Composition Playlist file is specified by [SMPTE-429-7] .
<?xml version="1.0" encoding="UTF-8" standalone="no"?>1 <CompositionPlaylist xmlns="http://www.smpte-ra.org/schemas/429-7/2006/CPL">2 <Id>urn:uuid:20670ba3-d4c7-4539-ac3e-71e874d4d7d1</Id>3 <IssueDate>2007-07-25T00:35:03-00:00</IssueDate>4 <Issuer>user@host</Issuer>5 <Creator> Packaging Tools v1.0 </Creator>6 <ContentTitleText>Perfect Movie</ContentTitleText>7 <ContentKind>feature</ContentKind>8 <ContentVersion>0 <Id>urn:uuid:e5a1b4dc-faf3-461b-a5e2-9d33088b1b28</Id>10 <LabelText>Perfect Movie - Domestic - US 5.1 </LabelText>11 </ContentVersion> <RatingList />12 <ReelList>13 <Reel>14 <Id>urn:uuid:f62cffe9-2da7-4d28-b73e-f21c816ab02f</Id>15 <AssetList>16 <MainPicture>17 <Id>urn:uuid:93270dd0-8675-42fa-9ce8-34b61c963997</Id>18 <EditRate>24 1</EditRate>19 <IntrinsicDuration>480</IntrinsicDuration>20 <EntryPoint>0</EntryPoint>21 <Duration>480</Duration>22 <FrameRate>24 1</FrameRate>23 <ScreenAspectRatio>1998 1080</ScreenAspectRatio>24 </MainPicture>25 <MainSound>26 <Id>urn:uuid:e33b7b37-da90-4429-88af-5c5b63506017</Id> <EditRate>24 1</EditRate> <IntrinsicDuration>2880</IntrinsicDuration> <EntryPoint>120</EntryPoint> <Duration>2760</Duration> </MainSound> </AssetList> </Reel> </ReelList> [Additional reel data and CPL Signature omitted for brevity] </CompositionPlaylist>
<RatingList> element contains at least one instance of the <Rating> element, which in turn contains two elements: <Agency>, which contains a URI that represents the agency that issued the rating, and <Label>, which contains the rating.
$ schema-check <input-file> smpte-429-7.xsd schema validation successful $
$ dsig_cert.py <cpl-file.cpl.xml> > tmp.xml $ checksig tmp.xml The supplied signature is valid $
<KeyId> value is listed. If an Asset Id occurs more than once in the CPL, verify that the same <KeyId> is utilized throughout.

<KeyId> is associated with only one Asset Id.

<KeyId> value) make a list of all Asset Id values and the associated <KeyId> values.

<KeyId>. If Asset Ids are repeated in the CPL, the same <KeyId> should be associated with that Asset every time. Any deviation is cause to fail this test.

<KeyId> is associated with exactly one Asset Id (i.e., a particular Decryption Key should only be associated with one, unique Asset). Any deviation is cause to fail this test.
A Track File is a container for encoded essence. In the d-cinema system, each Track File contains a single track of a single type of essence. For example, a Track File may contain images or sound or timed text, but never more than one type of essence.
D-cinema Track Files are based on the Material eXchange Format (MXF). MXF is a file metaformat, i.e., a file format for creating file formats. While the various d-cinema Track File formats represent different methods of encoding essence data, the arrangement of metadata within the files is syntactically similar. This section will provide an overview of MXF as used for d-cinema applications. Readers looking for more detailed technical information are referred to [SMPTE-377-1].
Before diving head-first into examining MXF files, it is important to understand the structure of the files. This section will briefly describe the contents of some example MXF files by displaying the files' header metadata using the klvwalk software utility from the free ASDCPLib software package.
Briefly, an MXF file [SMPTE-377-1] contains a sequence of Key-Length-Value (KLV) packets. Some packets carry essence and some carry metadata. MXF files are divided into partitions. Each partition is composed of a set of KLV packets. The first KLV packet in each partition is a Partition Pack.
The number of partitions in a digital cinema sound or picture Track File is usually three (Timed Text Track Files may have more than three partitions). The first partition in an MXF file contains the metadata which describe the coding parameters of the essence and the MXF file itself. The second partition contains the essence data as a sequence of KLV-wrapped frames. The final partition contains the index table.
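The KLV packet sequence described above can be illustrated with a minimal parser: a 16-byte UL key, a BER-encoded length, then the value bytes. This is a teaching sketch, not a replacement for klvwalk; it assumes the data starts at the first key and ignores details such as an optional run-in.

```python
def parse_klv(data: bytes):
    """Yield (key, value) pairs from a sequence of KLV packets.
    The key is a 16-byte SMPTE UL; the length is BER-encoded."""
    offset = 0
    while offset < len(data):
        key = data[offset:offset + 16]
        offset += 16
        first = data[offset]
        offset += 1
        if first & 0x80:                 # long form: low 7 bits give the
            n = first & 0x7F             # number of length octets that follow
            length = int.from_bytes(data[offset:offset + n], "big")
            offset += n
        else:                            # short form: length < 128
            length = first
        yield key, data[offset:offset + length]
        offset += length

# A fabricated packet: 16-byte key, short-form length 3, 3-byte value.
packet = bytes(range(16)) + b"\x03" + b"abc"
for key, value in parse_klv(packet):
    print(len(key), value)  # 16 b'abc'
```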
To display the metadata in the header partition of an MXF file testfile.mxf, use klvwalk like so:
$ klvwalk -r testfile.mxf ...
The following sections illustrate the expected output.
As shown in Example 4.5, the first structure to be output is the Partition Pack of the Header Partition. This structure documents the MXF version that the file conforms to and provides a description of the general architecture to be found inside.
06.0e.2b.34.02.05.01.01.0d.01.02.01.01.02.04.00 len: 120 (ClosedCompleteHeader)1 MajorVersion = 1 MinorVersion = 2 KAGSize = 1 ThisPartition = 0 PreviousPartition = 0 FooterPartition = 218362864 HeaderByteCount = 16244 IndexByteCount = 0 IndexSID = 0 BodyOffset = 0 BodySID = 1 OperationalPattern = 060e2b34.0401.0101.0d010201.100000002 Essence Containers:3 060e2b34.0401.0103.0d010301.027f0100 060e2b34.0401.0107.0d010301.020b0100
The following table gives the list of valid Essence Container ULs for d-cinema Track Files.
UL Value | Container Type |
---|---|
060e2b34.0401.0101.0d010301.02060100 | Linear PCM Audio [SMPTE-429-3] , [SMPTE-382] |
060e2b34.0401.0107.0d010301.020c0100 | JPEG 2000 Images [SMPTE-429-4] |
060e2b34.0401.010a.0d010301.02130101 | Timed Text [SMPTE-429-5] |
060e2b34.0204.0101.0d010301.027e0100 | Encrypted Essence [SMPTE-429-6] |
An MXF file may contain zero or more continuous segments of essence data. Each segment is described by a Source Package structure. Per [SMPTE-429-3] , MXF files for digital cinema must contain exactly one top-level Source Package (thus one segment of essence), referred to in MXF jargon as a File Package. Example 4.6 shows a Source Package structure that points to JPEG 2000 essence data.
06.0e.2b.34.02.53.01.01.0d.01.01.01.01.01.37.00 len: 294 (SourcePackage)1 InstanceUID = 42b5a376-c740-42e2-99f1-4ec782c4837e PackageUID = [060a2b34.0101.0105.01010f20],13,00,00,00, [b4f492cd.b89b.0f65.490c35ec.5f6340b7]2 Name = File Package: SMPTE 429-4 frame wrapping of JPEG 2000 codestreams PackageCreationDate = 2007-03-21 07:42:04.000 PackageModifiedDate = 2007-03-21 07:42:04.000 Tracks:3 9227a330-7e64-4c90-b4ef-d057ed6ef159 0de983e3-255b-4d26-bde7-f33c530c077d 54e13d93-abcf-4869-b008-c59573b8d01d Descriptor = c6a35640-d6d8-433c-82c9-23df2eae9311 4
If the MXF file contains encrypted essence, the header metadata will contain one Cryptographic Framework set with a link to a single Cryptographic Context set (defined in [SMPTE-429-6]). These structures are shown in Example 4.7.
06.0e.2b.34.02.53.01.01.0d.01.04.01.02.01.00.00 len: 40 (CryptographicFramework)1 InstanceUID = b98ca683-2e49-4e6a-88ff-af33910ba334 ContextSR = 8dcd2f7b-fd0b-4602-bae7-806c82dcfd94 06.0e.2b.34.02.53.01.01.0d.01.04.01.02.02.00.00 len: 120 (CryptographicContext)2 InstanceUID = 8dcd2f7b-fd0b-4602-bae7-806c82dcfd94 ContextID = 3472d593-e9ff-4b2e-84ca-5303b5ce53f7 SourceEssenceContainer = 060e2b34.0401.0107.0d010301.020c01003 CipherAlgorithm = 060e2b34.0401.0107.02090201.010000004 MICAlgorithm = 060e2b34.0401.0107.02090202.010000005 CryptographicKeyID = c030f37a-bf84-496b-bdc2-81744205a944 6
060e2b34.0401.0107.02090201.01000000
060e2b34.0401.0107.02090202.01000000
If the MXF file contains image essence for DCI-compliant digital cinema, the header metadata will contain an RGBA Essence Descriptor (defined in [SMPTE-377-1]), with a strong link to a JPEG 2000 Picture SubDescriptor (defined in [SMPTE-422]). These structures are shown in Example 4.8.
06.0e.2b.34.02.53.01.01.0d.01.01.01.01.01.29.00 len: 169 (RGBAEssenceDescriptor)1 InstanceUID = 18a47da5-53d1-4785-a91e-41155753a02f Locators: SubDescriptors: 05f80258-beb2-4769-b99a-af4d6c3895da LinkedTrackID = 2 SampleRate = 24/12 ContainerDuration = 7203 EssenceContainer = 060e2b34.0401.0107.0d010301.020c0100 Codec = 00000000.0000.0000.00000000.00000000 FrameLayout = 0 StoredWidth = 20484 StoredHeight = 10805 AspectRatio = 2048/1080 PictureEssenceCoding = 060e2b34.0401.0109.04010202.030101036 ComponentMaxRef = 4095 ComponentMinRef = 0 06.0e.2b.34.02.53.01.01.0d.01.01.01.01.01.5a.00 len: 174 (JPEG2000PictureSubDescriptor)7 InstanceUID = 05f80258-beb2-4769-b99a-af4d6c3895da Rsize = 3 Xsize = 2048 Ysize = 1080 XOsize = 0 YOsize = 0 XTsize = 2048 YTsize = 1080 XTOsize = 0 YTOsize = 0 Csize = 3 PictureComponentSizing = 00000003000000030b01010b01010b0101 CodingStyleDefault = 01040001010503030000778888888888 QuantizationDefault = 227f187f007f007ebc76ea76ea76bc6f4c6f4c6f645803580358455fd25fd25f61
If the MXF file contains audio essence for DCI-compliant digital cinema, the header metadata will contain a Wave Audio Descriptor (defined in [SMPTE-382]). This structure is shown in Example 4.9.
06.0e.2b.34.02.53.01.01.0d.01.01.01.01.01.48.00 len: 134 (WaveAudioDescriptor)1 InstanceUID = 0b7eac6c-85e2-47e4-b0bf-b3e60f6e6cd7 Locators: SubDescriptors: LinkedTrackID = 2 SampleRate = 24/12 ContainerDuration = 5283 EssenceContainer = 060e2b34.0401.0101.0d010301.02060100 AudioSamplingRate = 48000/14 Locked = 0 AudioRefLevel = 0 ChannelCount = 65 QuantizationBits = 246 DialNorm = 0 BlockAlign = 187 SequenceOffset = 0 AvgBps = 144000
All d-cinema Track Files end with a Random Index Pack (RIP). The RIP provides a lookup table that gives the location of all partitions in the file for easy random access. The number of partitions shown by the RIP should be three if the MXF file is a sound or picture Track File, and may be more than three for a Timed Text Track File.
06.0e.2b.34.02.05.01.01.0d.01.02.01.01.11.01.00 len: 40 (RandomIndexMetadata)11
0 : 0
1 : 16384
0 : 110688380
060e2b34.0205.0101.0d010201.01020400
060e2b34.0401.0102.0d010201.10000000, 060e2b34.0401.0103.0d010301.027f0100 and 060e2b34.0401.0107.0d010301.020c0100, 060e2b34.0401.0103.0d010301.027f0100 and 060e2b34.0401.0101.0d010301.02060100, 060e2b34.0401.0103.0d010301.027f0100 and 060e2b34.0401.0107.0d010301.020b0100, 060e2b34.0253.0101.0d010101.01012300.
060e2b34.0253.0101.0d010101.01013700
060e2b34.0253.0101.0d010101.01013700, 060e2b34.0253.0101.0d010101.01013a00.
060e2b34.0253.0101.0d010101.01010f00.
060e2b34.0253.0101.0d010101.01014100.
060e2b34.0253.0101.0d010401.02010000.
060e2b34.0253.0101.0d010401.02020000.
060e2b34.0401.0107.0d010301.020c0100
060e2b34.0401.0101.0d010301.02060100
060e2b34.0205.0101.0d010201.01020400
060e2b34.0401.0102.0d010201.10000000, 060e2b34.0401.0103.0d010301.027f0100 and 060e2b34.0401.010a.0d010301.02130101, 060e2b34.0401.0103.0d010301.027f0100 and 060e2b34.0401.0107.0d010301.020b0100, 060e2b34.0253.0101.0d010101.01012300.
060e2b34.0253.0101.0d010101.01013700
060e2b34.0401.010a.0d010301.02130101
For example:
$ asdcp-test -i -v <input-file> ... SampleRate: 24/1 ... ContainerDuration: 528 ... $
$ asdcp-test -i -v PerfectMovie-j2c-pt.mxf File essence type is JPEG 2000 pictures. ProductUUID: 43059a1d-0432-4101-b83f-736815acf31d ProductVersion: Unreleased 1.1.13 CompanyName: DCI ProductName: asdcplib EncryptedEssence: No AssetUUID: 0e676fb1-951b-45c4-8334-ed2c59199815 Label Set Type: SMPTE AspectRatio: 2048/1080 EditRate: 24/1 StoredWidth: 2048 StoredHeight: 1080 Rsize: 3 Xsize: 2048 Ysize: 1080 XOsize: 0 YOsize: 0 XTsize: 2048 YTsize: 1080 XTOsize: 0 YTOsize: 0 ContainerDuration: 240 Color Components: 11.1.1 11.1.1 11.1.1 Default Coding (16): 01040001010503030000778888888888 Quantization Default (33): 227f187f007f007ebc76ea76ea76bc6f4c6f4c6f645803580358455fd25fd25f61
$ asdcp-test -x first -d 1 -f 0 PerfectMovie-j2c-pt.mxf $ asdcp-test -x last -d 1 -f 239 PerfectMovie-j2c-pt.mxf $ ls first000000.j2c last000239.j2c PerfectMovie-j2c-pt.mxf
$ j2c-scan frame000000.j2c digital cinema profile: none rsiz capabilities: standard pixel offset from top-left corner: (0, 0) tile width/height in pixels: (2048, 1080) image width/height in tiles: (1, 1) tile #1 coding style: 1 progression order: Component-Position-Resolution-Layer POC marker flag: 0 number of quality layers: 1 rate for layer #1: 0.0 multi-component transform flag: 1 ...
Use klvwalk to display the length of every WAVEssence set (UL value 060e2b34.0102.0101.0d010301.16010101) and check that each frame contains the appropriate number of bytes. The expected number of audio bytes per frame can be calculated using the formula len = BPS * Ch * SPF, where BPS is the number of Bytes Per Sample (BPS = 3), Ch is the number of Audio Channels in the DCP, and SPF is the Samples Per Frame value taken from Table 4.2. If any frame has an actual len that differs from the expected value calculated from the formula, this is cause to fail this test.
The example below shows eight frames of a composition containing six channels of 48 kHz samples at 24 fps, each frame wrapped in a KLV triplet (3 * 6 * 2000 = 36000 bytes).
$klvwalk PerfectMovie-pcm-pt.mxf ... 060e2b34.0102.0101.0d010301.16010101 len: 36000 (WAVEssence) 060e2b34.0102.0101.0d010301.16010101 len: 36000 (WAVEssence) 060e2b34.0102.0101.0d010301.16010101 len: 36000 (WAVEssence) 060e2b34.0102.0101.0d010301.16010101 len: 36000 (WAVEssence) 060e2b34.0102.0101.0d010301.16010101 len: 36000 (WAVEssence) 060e2b34.0102.0101.0d010301.16010101 len: 36000 (WAVEssence) 060e2b34.0102.0101.0d010301.16010101 len: 36000 (WAVEssence) 060e2b34.0102.0101.0d010301.16010101 len: 36000 (WAVEssence) ...The possible values for the Samples/Frame are shown in table below.
FPS | Sample Rate | Samples/Frame |
---|---|---|
24 | 48 kHz | 2000 |
24 | 96 kHz | 4000 |
48 | 48 kHz | 1000 |
48 | 96 kHz | 2000 |
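The per-frame length calculation can be sketched in Python; the function name and the dictionary encoding of the table are illustrative, not part of any tool:

```python
# Samples-per-frame values from Table 4.2, keyed by (fps, sample rate in Hz).
SAMPLES_PER_FRAME = {
    (24, 48000): 2000,
    (24, 96000): 4000,
    (48, 48000): 1000,
    (48, 96000): 2000,
}

def expected_frame_length(channel_count, fps, sample_rate, bytes_per_sample=3):
    """Expected byte count of one frame's WAVEssence KLV value: BPS * Ch * SPF."""
    return bytes_per_sample * channel_count * SAMPLES_PER_FRAME[(fps, sample_rate)]

# Six channels of 48 kHz audio at 24 fps: 3 * 6 * 2000 = 36000 bytes per frame.
assert expected_frame_length(6, 24, 48000) == 36000
```

Comparing this value against the len field reported by klvwalk for each WAVEssence triplet implements the check described above.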
$ klvwalk -r PerfectMovie-j2c-pt.mxf ... 060e2b34.0253.0101.0d010101.01012900 len: 169 (RGBAEssenceDescriptor) InstanceUID = 82141918-ce1b-47a5-ac13-c47cfb2e51a7 GenerationUID = 00000000-0000-0000-0000-000000000000 Locators: SubDescriptors: 92e96e5e-6bef-4985-8117-7dfa541f96fa LinkedTrackID = 2 SampleRate = 24/1 ContainerDuration = 240 EssenceContainer = 060e2b34.0401.0107.0d010301.020c0100 Codec = 060e2b34.0401.0109.04010202.03010103 FrameLayout = 0 StoredWidth = 2048 StoredHeight = 1080 AspectRatio = 2048/1080 ComponentMaxRef = 4095 ComponentMinRef = 0 ...The valid Image Structure Container values are shown in table below.
Operational level | Maximum Horizontal Pixels | Maximum Vertical Pixels | Frames per Second |
---|---|---|---|
1 | 4096 | 2160 | 24 |
2 | 2048 | 1080 | 48 |
3 | 2048 | 1080 | 24 |
$ asdcp-test -d 1 -x frame j2c/PerfectMovie-j2c-pt.mxf $ j2c-scan frame_000001.j2c coding parameters digital cinema profile: none rsiz capabilities: standard pixel offset from top-left corner: (0, 0) tile width/height in pixels: (2048, 1080) image width/height in tiles: (1, 1) ...
The PictureEssenceCoding field of the MXF RGBAEssenceDescriptor (see 6 in Example 4.8) is one of: 060e2b34.0401.0109.04010202.03010103 (for 2K images) or 060e2b34.0401.0109.04010202.03010104 (for 4K images).
$ asdcp-test -x frame j2c/PerfectMovie-j2c-pt.mxf $ ls j2c frame000000.j2c frame000057.j2c frame000124.j2c frame000191.j2c frame000001.j2c frame000058.j2c frame000125.j2c frame000192.j2c frame000002.j2c frame000059.j2c frame000126.j2c frame000193.j2c frame000003.j2c frame000060.j2c frame000127.j2c frame000194.j2c ...
$ j2c-scan frame000000.j2c digital cinema profile: none rsiz capabilities: standard pixel offset from top-left corner: (0, 0) tile width/height in pixels: (2048, 1080) image width/height in tiles: (1, 1) tile #1 coding style: 1 progression order: Component-Position-Resolution-Layer POC marker flag: 0 number of quality layers: 1 rate for layer #1: 0.0 multi-component transform flag: 1 ...
The WaveAudioDescriptor is identified by the UL value 060e2b34.0253.0101.0d010101.01014800. An example is shown below.
$ klvwalk -r PerfectMovie-pcm-pt.mxf ... 060e2b34.0253.0101.0d010101.01014800 len: 134 (WaveAudioDescriptor) InstanceUID = e1c4c755-2c3e-4274-a3bf-581aadd63a4b GenerationUID = 00000000-0000-0000-0000-000000000000 Locators: SubDescriptors: LinkedTrackID = 2 SampleRate = 24/1 ContainerDuration = 480 EssenceContainer = 060e2b34.0401.0101.0d010301.02060100 Codec = 00000000.0000.0000.00000000.00000000 AudioSamplingRate = 48000/1 Locked = 0 AudioRefLevel = 0 ChannelCount = 6 QuantizationBits = 24 DialNorm = 0 BlockAlign = 18 SequenceOffset = 0 AvgBps = 144000 ...Verify the following:
The EssenceContainer value shall be 060e2b34.0401.0101.0d010301.02060100. Any other value is cause to fail this test.
The BlockAlign value shall be ChannelCount * 3. Any other value is cause to fail this test.
The AvgBps value shall be AudioSamplingRate * ChannelCount * 3. Any other value is cause to fail this test.
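These descriptor checks can be sketched in Python. The input is assumed to be a dict of field name to value, e.g. scraped from klvwalk output; the function name is illustrative, not part of asdcplib:

```python
# UL identifying the PCM sound essence container in a SMPTE DCP.
SOUND_ESSENCE_CONTAINER_UL = "060e2b34.0401.0101.0d010301.02060100"

def check_wave_audio_descriptor(desc):
    """Return a list of failure reasons; an empty list means the checks pass."""
    failures = []
    if desc["EssenceContainer"] != SOUND_ESSENCE_CONTAINER_UL:
        failures.append("unexpected EssenceContainer UL")
    if desc["BlockAlign"] != desc["ChannelCount"] * 3:
        failures.append("BlockAlign != ChannelCount * 3")
    if desc["AvgBps"] != desc["AudioSamplingRate"] * desc["ChannelCount"] * 3:
        failures.append("AvgBps != AudioSamplingRate * ChannelCount * 3")
    return failures
```

Any non-empty result corresponds to a test failure.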
$ schema-check testfile.xml S428-7-2007.xsd $
$ ftlint 1 font_file.otf font_file.otf: OK. $
$ identify -verbose subpicture_0001.png Image: subpicture_0001.png Format: PNG (Portable Network Graphics) Geometry: 120x420 Class: DirectClass Colorspace: RGB Type: GrayscaleMatte Depth: 8 bits ...
Verify that the size of each asset file matches the value of the <Size> element of the <Asset> element. Inconsistency is cause to fail this test.
Verify that the message digest of each asset file matches the value of the <Hash> element of the <Asset> element. Inconsistency is cause to fail this test.
The following is an example using the asdcp-test software utility:
$ asdcp-test -t PerfectMovie-j2c-pt.mxf t0MirEHOVFF4Mi1IP0iYVjrvb14= PerfectMovie-j2c-pt.mxf
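The value printed by asdcp-test -t is the Base64-encoded SHA-1 message digest of the file, the same value carried by the Packing List <Hash> element. A minimal re-implementation sketch (the function name is illustrative):

```python
import base64
import hashlib

def asset_digest(path):
    """Base64-encoded SHA-1 message digest of an asset file, as printed by
    asdcp-test -t and carried in the Packing List <Hash> element."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        # Read in chunks so arbitrarily large track files can be digested.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return base64.b64encode(h.digest()).decode("ascii")
```

Comparing this value against the PKL <Hash> element implements the digest check above.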
This chapter contains test procedures of security features that apply to more than one type of device. Procedures are given for Type 1 and Type 2 Secure Processing Block (SPB) physical security requirements, Intra-theater communications, and security log reporting.
The test procedures in this section apply to any device or component that is classified as a Type 1 or Type 2 SPB.
If the Test Subject is a Media Block:
Roles listed in the Subject Common Name | DigitalSignature flag | KeyEncipherment flag |
---|---|---|
includes the SM and MIC roles, but does not include any of the LS and RES roles | false | true |
includes the SM, MIC and RES roles, but does not include the LS role | false | true |
includes LS role | true | false |
For any other Test Subject:
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
20.4. OMB Confidence Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
24.2. SDR Projector Test Sequence | Pass/Fail | — |
24.4. SDR Projector Confidence Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
26.4. HDR Direct View Display Confidence Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
27.4. SDR Direct View Display Confidence Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
28.4. HDR Projector Confidence Sequence | Pass/Fail | — |
The section "SPB Type 2 Security Perimeter" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "SPB Type 2 Secure Silicon" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "TLS Session Initiation" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Auditorium Security Message Support" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "ASM Failure Behavior" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "ASM 'RRP Invalid'" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "ASM 'GetTime'" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "ASM 'GetEventList'" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "ASM 'GetEventID'" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "ASM 'LEKeyLoad'" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "ASM 'LEKeyQueryID'" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "ASM 'LEKeyQueryAll'" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "ASM 'LEKeyPurgeID'" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "ASM 'LEKeyPurgeAll'" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "ASM 'GetProjCert'" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "TLS Exception Logging" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Secure Processing Block (SPB) modules are required to provide event log reports on demand. The log reports are XML documents (see Section 3.1 ) having a structure defined by [SMPTE-430-4] . This section describes the report format and presents procedures for testing general operational requirements for event logging.
The method of generating a log report will vary between implementations. Consult the manufacturer's documentation for log report generation instructions.
Standard d-cinema log reports are encoded as XML documents per [SMPTE-430-4] . The reports consist of a preamble, which identifies the device that created the report, and a sequence of log records. In log reports which contain security events (Security Event Logs), some of the log records may contain XML Signature elements. The report format includes many unique security features; the reader should study [SMPTE-430-4] in detail to understand how log authentication works.
The following subsections detail the major features of a log report.
A collection of one or more log records is presented as an XML document having a single LogReport element as the top-level element. The log report begins with reportDate and reportingDevice elements. The contents of the elements identify the time the log was created and the device that created the log.
<?xml version="1.0" encoding="UTF-8"?> <LogReport1 xmlns="http://www.smpte-ra.org/schemas/430-4/2008/LogRecord/"2 xmlns:dcml="http://www.smpte-ra.org/schemas/433/2008/dcmlTypes/"> <reportDate>2007-05-04T09:30:47-08:00</reportDate>3 <reportingDevice>4 <dcml:DeviceIdentifier idtype="CertThumbprint">YmVsc3dpY2tAZW50ZXJ0ZWNoLmNvbQ== </dcml:DeviceIdentifier> <dcml:DeviceTypeID scope="http://www.smpte-ra.org/schemas/433/2008/dcmlTypes#DeviceTypeTokens">SM </dcml:DeviceTypeID> <dcml:AdditionalID>vnqteTcB2Gji\+1Hl23sxxgOqvwE=</dcml:AdditionalID>5 <dcml:DeviceSerial>000000042</dcml:DeviceSerial>6 <dcml:ManufacturerCertID>rlpve6MSncWouNIpFcTSIhk6w2A=</dcml:ManufacturerCertID>7 <dcml:DeviceCertID>9czqa+0orIADHDIYxAkn/IcmZ3o=</dcml:DeviceCertID> <dcml:ManufacturerName>Acme Digital Cinema Inc.</dcml:ManufacturerName> <dcml:DeviceName>Mojo Media Block</dcml:DeviceName> <dcml:ModelNumber>MB-3000</dcml:ModelNumber> <dcml:VersionInfo> <dcml:Name>Bootloader</dcml:Name> <dcml:Value>1.0.0.0</dcml:Value> <dcml:Name>Security Module</dcml:Name> <dcml:Value>3.4.2.1</dcml:Value> </dcml:VersionInfo> </reportingDevice>
The LogReport element is the root element of a log report document. The LogRecord and DCML namespaces are used.
Each event contained in the log report is encoded as a LogRecordElement element. This element type has three major sub-elements: LogRecordHeader , LogRecordBody , and LogRecordSignature . The first two are shown in the example below; the last is the subject of the next section.
The log record element defined in [SMPTE-430-4] is known by two names. The correct name to use depends on context. Testing a candidate document against the LogRecord schema will verify correct use. When a log record (defined as the complex type logRecordType in the LogRecord schema) appears as a sub-element of a LogReport element, the record element name is LogRecordElement . When a log record appears as the root element of an XML document, the record element name is LogRecord .
<LogRecordElement1 xmlns="http://www.smpte-ra.org/schemas/430-4/2008/LogRecord/" xmlns:dcml="http://www.smpte-ra.org/schemas/433/2008/dcmlTypes/"> <LogRecordHeader> <EventID>urn:uuid:8a221dfc-f5c6-426d-a2b8-9f6ff1cc6e31</EventID>2 <TimeStamp>2005-12-17T10:45:00-05:00</TimeStamp>3 <EventSequence>1000003</EventSequence>4 <DeviceSourceID> <dcml:PrimaryID idtype="CertThumbprint">kkqiVpDUAggQDHyHz0x9cDcsseU=</dcml:PrimaryID> </DeviceSourceID> <EventClass>http://www.smpte-ra.org/430.5/2007/SecurityLog/</EventClass>5 <EventType scope="http://www.smpte-ra.org/430.5/2007/SecurityLog/#EventTypes">Key</EventType>6 <contentId>urn:uuid:733365c3-2d44-4f93-accd-43cb39b0cedf</contentId>7 <previousHeaderHash>9czqa+0orIADHDIYxAkn/IcmZ3o=</previousHeaderHash>8 <recordBodyHash>9czqa+0orIADHDIYxAkn/IcmZ3o=</recordBodyHash>9 </LogRecordHeader> <LogRecordBody> <EventID>urn:uuid:8a221dfc-f5c6-426d-a2b8-9f6ff1cc6e31</EventID> <EventSubType scope="http://www.smpte-ra.org/430.5/2007/SecurityLog/#EventSubTypes-key"> KDMKeysReceived </EventSubType>10 <Parameters>11 <dcml:Parameter> <dcml:Name>SignerID</dcml:Name> <dcml:Value xsi:type="ds:DigestValueType">rlpve6MSncWouNIpFcTSIhk6w2A=</dcml:Value> </dcml:Parameter> </Parameters> <Exceptions>12 <dcml:Parameter> <dcml:Name>KDMFormatError</dcml:Name> <dcml:Value xsi:type="xs:string">XML validation failed on line 36</dcml:Value> </dcml:Parameter> </Exceptions> <ReferencedIDs>13 <ReferencedID> <IDName>CompositionID</IDName> <IDValue>urn:uuid:64bb6972-13a0-1348-a5e3-ae45420ea57d</IDValue> </ReferencedID> <ReferencedID> <IDName>KeyDeliveryMessageID</IDName> <IDValue>urn:uuid:64bb6972-13a0-1348-a5e3-ae45420ea57d</IDValue> </ReferencedID> </ReferencedIDs> </LogRecordBody> </LogRecordElement>
The LogRecordElement element contains a single log record, corresponding to a single system event. If the log record is the root element of an XML document, the element name will be LogRecord .
The previousHeaderHash element contains the digest of the Header element in the record that preceded this one in the report. This element should not be used in a stand-alone LogRecord element.
The recordBodyHash element contains the digest of the Body element contained within the same parent LogRecordElement or LogRecord element.
An XML Signature is used to create a tamper-proof encoding. The signature is made over the contents of the RecordAuthData element as shown in the following example. The RecordAuthData element contains the digest of the containing record's LogRecordHeader element. Consult [SMPTE-430-4] for details on extending the signature's proof of authenticity to preceding records via the contents of the header's previousHeaderHash element.
<LogRecordSignature>1 <HeaderPlacement>stop</HeaderPlacement> <SequenceLength>2</SequenceLength> <RecordAuthData Id="ID_RecordAuthData">2 <RecordHeaderHash>SG93IE1hbnkgTW9yZSBSZXZpc2lvbnM/</RecordHeaderHash>3 <SignerCertInfo>4 <ds:X509IssuerName>CN=DistCo-ca,OU=DistCo-ra,O=DistCo-ra, dnQualifier=vnqteTcB2Gji\+1Hl23sxxgOqvwE=</ds:X509IssuerName> <ds:X509SerialNumber>16580</ds:X509SerialNumber> </SignerCertInfo> </RecordAuthData> <Signature>5 <ds:SignedInfo> <ds:CanonicalizationMethod Algorithm="http://www.w3.org/TR/2001/REC-xml-c14n-20010315" /> <ds:SignatureMethod Algorithm="http://www.w3.org/2000/09/xmldsig#rsa-sha256" /> <ds:Reference URI="#ID_RecordAuthData"> <ds:DigestMethod Algorithm="http://www.w3.org/2000/09/xmldsig#sha1" /> <ds:DigestValue>VGhpcyBvbmx5IHRvb2sgdHdvIHllYXJz</ds:DigestValue> </ds:Reference> </ds:SignedInfo> <ds:SignatureValue> Vqe6MS0pHovkfqhHlkt/NNEI1GGchCW/EyqxOccSenuzNQc63qL+VIQoIJCcwgnE0i/w/8bIgjfB PrsOW5M3zlR0eAZc7tt6f7q50taNmC+O2wfATVXqEE8KC32qO//NQHuOL6bLLH+12oqgR5fS/mlI /wpn8s/pAtGA9lAXDRp03EVOvzwq0m9AjzOxIbgzGg6AIY0airJ1gecT1qccb1zGQjB81pr3ctlp ECchubtSCqh+frRn4CZc4ZRMLhjnax/zwHIG4ExiMCEKbwaz7DwN8zv1yoPUzut9ik7X0EyfRIlv F3piQoLeeFcFrkfNwYyyhTX8iHTO4Cz8YfGNyw==</ds:SignatureValue> <ds:KeyInfo> <ds:X509Data> <ds:X509IssuerSerial> <ds:X509IssuerName>Sample Issuer Name</ds:X509IssuerName> <ds:X509SerialNumber>1234567</ds:X509SerialNumber> </ds:X509IssuerSerial> <!-- X509 certificate value as block of Base64 encoded characters, --> <!-- truncated for brevity --> <ds:X509Certificate> QSBDZXJ0aWZpY2F0ZSB3b3VsZCBiZSBsb25nZXIgdGhhbiB0aGlz</ds:X509Certificate> </ds:X509Data> <ds:X509Data> <ds:X509IssuerSerial> <ds:X509IssuerName>Sample Issuer Name 2</ds:X509IssuerName> </ds:X509IssuerSerial> <!-- X509 certificate value as block of Base64 encoded characters, --> <!-- truncated for brevity --> <ds:X509Certificate>TG9uZ2VyIHRoYW4gdGhpcyB0b28sIGZvciBzdXJl</ds:X509Certificate> </ds:X509Data> </ds:KeyInfo> </Signature> </LogRecordSignature>
The LogRecordSignature element contains the signature of a log record.
The RecordAuthData element is the content that is actually signed for the signature. This element is identified for the signature processor by the Id attribute value.
The RecordHeaderHash element contains the digest of the record's Header element.
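The header and body hashes can be sketched in Python under the assumption that each is the Base64-encoded SHA-1 digest of the canonicalized element; consult [SMPTE-430-4] for the normative canonicalization rules, which this sketch does not claim to reproduce exactly:

```python
import base64
import hashlib
import xml.etree.ElementTree as ET

def element_digest(element_xml):
    """Base64 SHA-1 digest of a canonicalized XML element: a sketch of the
    kind of value carried by RecordHeaderHash / recordBodyHash.
    The exact rules are specified in [SMPTE-430-4]."""
    c14n = ET.canonicalize(element_xml)  # Canonical XML 1.0 (Python 3.8+)
    digest = hashlib.sha1(c14n.encode("utf-8")).digest()
    return base64.b64encode(digest).decode("ascii")
```

A verifier would recompute this value for each record and compare it against the stored hash element.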
XML Signatures on log reports can be checked using the procedure in Section 3.1.3 .
The section "Log Record Proxy" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
$ schema-check <input-file> smpte-433.xsd smpte-430-4.xsd schema validation successful
$ uuid_check.py <input-file> all UUIDs conform to RFC-4122 $
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
The section "Log Records for Multiple Remote SPBs" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Verify that the Test Subject assigns an EventSequence number to each log record it creates, and that this EventSequence number appears in the Header node of each log record in a report.
Verify that each record contains an EventSequence value that is one greater than the value in the previous record.
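The sequential-numbering check can be sketched in Python; the function name and record representation are illustrative, assuming records have been parsed from the report in order:

```python
def check_event_sequence(records):
    """True if every record's EventSequence is exactly one greater than its
    predecessor's. `records` is the report's log records in report order,
    each a dict holding the integer EventSequence from its Header node."""
    return all(cur["EventSequence"] == prev["EventSequence"] + 1
               for prev, cur in zip(records, records[1:]))
```

A False result corresponds to a test failure.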
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
The section "Log Collection by the SM" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "General Log System Failure" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Verify that all Log Records within a Log Report are properly authenticated as specified in [SMPTE-430-4] and [SMPTE-430-5] .
Verify that the Log Report is signed by the SM.
Verify that EventID for a given event is maintained across collections.
The CPLStart and CPLEnd records are triggered by the first and last edit unit, respectively, of the CPL reproduced by the Test Subject. For example, in the case of an OMB with OBAE capability, the first and last edit units of the CPL are OBAE edit units, since picture edit units are not reproduced despite Main Picture assets being present in the CPL received by the OMB.
In each report, locate the CPLStart record. Failure for the records in the two reports to have the same EventID value is cause to fail this test.
Note: The following steps shall use the Log Report extracted in Step 2.
Verify that the document is a valid LogReport . Failure of this verification is cause to fail the test.
Verify the recordBodyHash elements as specified in Section 6.1.1.5 of [SMPTE-430-5]; and
verify the LogRecordSignature element as specified in Section 7.3 of [SMPTE-430-4] and Section 6.1.3 of [SMPTE-430-5].
Verify the previousHeaderHash (unless the Log Record is the first of a sequence) and recordBodyHash elements as specified in Section 6.1.1.5 of [SMPTE-430-5]; and
verify the LogRecordSignature element as specified in Section 7.3 of [SMPTE-430-4] and Section 6.1.3 of [SMPTE-430-5].
Locate the LogRecordSignature element. Using its X509IssuerName and X509SerialNumber from the SignerCertInfo element, locate elements that match in one of the KeyInfo elements and extract the device certificate from its X509Certificate element. Absence of a device certificate or mismatched X509IssuerName and X509SerialNumber values shall be cause to fail the test.
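The issuer/serial matching step can be sketched in Python; the inputs are assumed pre-parsed from the XML, and all names here are illustrative:

```python
def find_signer_certificate(signer_cert_info, key_info_entries):
    """Locate the device certificate whose issuer name and serial number
    match the SignerCertInfo element; return None if absent, which
    corresponds to a test failure.

    signer_cert_info is an (issuer_name, serial_number) tuple; each entry in
    key_info_entries is a dict with "issuer", "serial" and "certificate" keys
    parsed from a ds:KeyInfo / ds:X509Data element."""
    issuer, serial = signer_cert_info
    for entry in key_info_entries:
        if entry["issuer"] == issuer and entry["serial"] == serial:
            return entry.get("certificate")
    return None
```

The returned certificate would then be used to validate the record signature.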
Verify that the LogReport element contains a single reportingDevice child element as defined in [SMPTE-430-4]. Failure of this verification is cause to fail this test.
Verify that the reportingDevice element meets the following requirements. Failure to meet any of these requirements is cause to fail this test.
If the idtype attribute of the DeviceIdentifier element is equal to "DeviceUID" , the DeviceCertID element shall also be present and shall contain the certificate thumbprint of the SM Certificate.
If the idtype attribute of the DeviceIdentifier element is equal to "DeviceUID" , it shall contain the device UUID of the Test Subject.
If the idtype attribute of the DeviceIdentifier element is equal to "CertThumbprint" , it shall contain the certificate thumbprint of the SM Certificate of the Test Subject.
The AdditionalID element shall be present and its value set to the certificate thumbprint of the LS Certificate, encoded as a ds:DigestValueType type.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the Test Subject assigns an EventSequence number to each log record it creates, and that this EventSequence number appears in the Header node of each log record in a report.
Verify that each record contains an EventSequence value that is one greater than the value in the previous record.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
Verify that the OBAE-capable Test Subject provides log event information in the form of Log Reports.
Verify that all Log Records within a Log Report are properly authenticated as specified in [SMPTE-430-4] and [SMPTE-430-5] .
Verify that the Log Report is signed by the SM.
Verify that EventID for a given event is maintained across collections.
The CPLStart and CPLEnd records are triggered by the first and last edit unit, respectively, of the CPL reproduced by the Test Subject. For example, in the case of an OMB with OBAE capability, the first and last edit units of the CPL are OBAE edit units, since picture edit units are not reproduced despite Main Picture assets being present in the CPL received by the OMB.
If the Test Subject uses a single certificate implementation as defined in Section 9.5.1.1 of [DCI-DCSS] :
In each report, locate the CPLStart record. Failure for the records in the two reports to have the same EventID value is cause to fail this test.
Note: The following steps shall use the Log Report extracted in Step 2.
Verify that the document is a valid LogReport . Failure of this verification is cause to fail the test.
Verify the recordBodyHash elements as specified in Section 6.1.1.5 of [SMPTE-430-5]; and
verify the LogRecordSignature element as specified in Section 7.3 of [SMPTE-430-4] and Section 6.1.3 of [SMPTE-430-5].
Verify the previousHeaderHash (unless the Log Record is the first of a sequence) and recordBodyHash elements as specified in Section 6.1.1.5 of [SMPTE-430-5]; and
verify the LogRecordSignature element as specified in Section 7.3 of [SMPTE-430-4] and Section 6.1.3 of [SMPTE-430-5].
Locate the LogRecordSignature element. Using its X509IssuerName and X509SerialNumber from the SignerCertInfo element, locate elements that match in one of the KeyInfo elements and extract the device certificate from its X509Certificate element. Absence of a device certificate or mismatched X509IssuerName and X509SerialNumber values shall be cause to fail the test.
If the Test Subject uses a dual certificate implementation as defined in Section 9.5.1.2 of [DCI-DCSS] :
Verify that the LogReport element contains a single reportingDevice child element as defined in [SMPTE-430-4]. Failure of this verification is cause to fail this test.
Verify that the reportingDevice element meets the following requirements. Failure to meet any of these requirements is cause to fail this test.
If the idtype attribute of the DeviceIdentifier element is equal to "DeviceUID" , the DeviceCertID element shall also be present and shall contain the certificate thumbprint of the SM Certificate.
If the idtype attribute of the DeviceIdentifier element is equal to "DeviceUID" , it shall contain the device UUID of the Test Subject.
If the idtype attribute of the DeviceIdentifier element is equal to "CertThumbprint" , it shall contain the certificate thumbprint of the SM Certificate of the Test Subject.
The AdditionalID element shall be present and its value set to the certificate thumbprint of the LS Certificate, encoded as a ds:DigestValueType type.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
The section "SM Proxy of Log Events" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "SM Proxy of Security Operations Events" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "SM Proxy of Security ASM Events" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Remote SPB Time Compensation" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Secure Processing Blocks (SPB) are required to record Security Log Events (defined in [SMPTE-430-5] ) upon the occurrence of certain operational states. The procedures in this section should cause the Test Subject to record the respective events.
Verify that the Test Subject records FrameSequencePlayed events per [SMPTE-430-5].
Locate a log record with Event Class Security , Type Playout , Event Subtype FrameSequencePlayed .
Verify that the FrameSequencePlayed record has correctly formatted parameters as defined in [SMPTE-430-5].
A missing or incorrectly formatted FrameSequencePlayed record shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the Test Subject records CPLStart events per [SMPTE-430-5].

The CPLStart and CPLEnd records are triggered by the first and last edit unit, respectively, of the CPL reproduced by the Test Subject. For example, in the case of an OMB with OBAE capability, the first and last edit units of the CPL are OBAE edit units, since picture edit units are not reproduced despite Main Picture assets being present in the CPL received by the OMB.

Locate a log record with Event Class Security , Type Playout , Event Subtype CPLStart .
Verify that the CPLStart record has correctly formatted parameters as defined in [SMPTE-430-5].
A missing or incorrectly formatted CPLStart event shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the Test Subject records CPLEnd events per [SMPTE-430-5].

The CPLStart and CPLEnd records are triggered by the first and last edit unit, respectively, of the CPL reproduced by the Test Subject. For example, in the case of an OMB with OBAE capability, the first and last edit units of the CPL are OBAE edit units, since picture edit units are not reproduced despite Main Picture assets being present in the CPL received by the OMB.

Locate a log record with Event Class Security , Type Playout , Event Subtype CPLEnd .
Verify that the CPLEnd record has correctly formatted parameters as defined in [SMPTE-430-5].
A missing or incorrectly formatted CPLEnd event shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the Test Subject records PlayoutComplete events per [SMPTE-430-5].
Locate a log record with Event Class Security , Type Playout , Event Subtype PlayoutComplete .
Verify that the PlayoutComplete record has correctly formatted parameters as defined in [SMPTE-430-5].
A missing or incorrectly formatted PlayoutComplete record shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the Test Subject records CPLCheck events per [SMPTE-430-5].
Locate a log record with Event Class Security , Type Validation , Event Subtype CPLCheck .
Verify that the CPLCheck record has correctly formatted parameters as defined in [SMPTE-430-5].
A missing or incorrectly formatted CPLCheck event shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the Test Subject records KDMKeysReceived events per [SMPTE-430-5].
Locate a log record with Event Class Security , Type Key , Event Subtype KDMKeysReceived .
Verify that the KDMKeysReceived record has correctly formatted parameters as defined in [SMPTE-430-5].
A missing or incorrectly formatted KDMKeysReceived event shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
Verify that the Test Subject records KDMDeleted events per [SMPTE-430-5].
Locate a log record with Event Class Security , Type Key , Event Subtype KDMDeleted .
Verify that the KDMDeleted record has correctly formatted parameters as defined in [SMPTE-430-5].
A missing or incorrectly formatted KDMDeleted event shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the OBAE-capable Test Subject records FrameSequencePlayed events per [SMPTE-430-5].
Locate a log record with Event Class Security , Type Playout , Event Subtype FrameSequencePlayed associated with the OBAE essence in DCI 2K Sync Test (OBAE) (Encrypted).
Verify that the FrameSequencePlayed record has correctly recorded parameters as defined in [SMPTE-430-5].
Verify that the Parameters list of the FrameSequencePlayed record contains a name/value pair whose Name element contains the token OBAEMark , and whose Value element shall contain one of two tokens, either true or false , indicating that a forensic mark was or was not inserted during playout.
A missing or incorrectly formatted FrameSequencePlayed record as detailed above shall be cause to fail this test.
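The OBAEMark parameter check on a FrameSequencePlayed record can be sketched in Python; the record representation and function name are illustrative, assuming the Parameters name/value pairs have been parsed from the log record:

```python
def obae_mark_value(parameters):
    """Scan a FrameSequencePlayed record's Parameters name/value pairs for
    the OBAEMark token. Return "true" or "false" when present and valid,
    or None when absent or malformed (a test failure).

    `parameters` is a list of {"Name": ..., "Value": ...} dicts."""
    for pair in parameters:
        if pair["Name"] == "OBAEMark":
            value = pair["Value"]
            return value if value in ("true", "false") else None
    return None
```

A None result corresponds to a missing or incorrectly formatted record.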
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the Test Subject records CPLStart events per [SMPTE-430-5].

The CPLStart and CPLEnd records are triggered by the first and last edit unit, respectively, of the CPL reproduced by the Test Subject. For example, in the case of an OMB with OBAE capability, the first and last edit units of the CPL are OBAE edit units, since picture edit units are not reproduced despite Main Picture assets being present in the CPL received by the OMB.

Locate a log record with Event Class Security , Type Playout , Event Subtype CPLStart .
Verify that the CPLStart record has correctly formatted parameters as defined in [SMPTE-430-5].
A missing or incorrectly formatted CPLStart event shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the Test Subject records CPLEnd events per [SMPTE-430-5].

The CPLStart and CPLEnd records are triggered by the first and last edit unit, respectively, of the CPL reproduced by the Test Subject. For example, in the case of an OMB with OBAE capability, the first and last edit units of the CPL are OBAE edit units, since picture edit units are not reproduced despite Main Picture assets being present in the CPL received by the OMB.

Locate a log record with Event Class Security , Type Playout , Event Subtype CPLEnd .
Verify that the CPLEnd record has correctly formatted parameters as defined in [SMPTE-430-5].
A missing or incorrectly formatted CPLEnd event shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
PlayoutComplete events per [SMPTE-430-5].
Security, Type Playout, Event Subtype PlayoutComplete.
PlayoutComplete record has correctly formatted parameters as defined in [SMPTE-430-5].
PlayoutComplete shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
CPLCheck events per [SMPTE-430-5].
Security, Type Validation, Event Subtype CPLCheck.
CPLCheck record has correctly formatted parameters as defined in [SMPTE-430-5].
CPLCheck event shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
KDMKeysReceived events per [SMPTE-430-5].
Security, Type Key, Event Subtype KDMKeysReceived.
KDMKeysReceived record has correctly formatted parameters as defined in [SMPTE-430-5].
KDMKeysReceived event shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
20.4. OMB Confidence Sequence | Pass/Fail | — |
KDMDeleted events per [SMPTE-430-5].
Security, Type Key, Event Subtype KDMDeleted.
KDMDeleted record has correctly formatted parameters as defined in [SMPTE-430-5].
KDMDeleted event shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
The section "LinkOpened Event" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "LinkClosed Event" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "LinkException Event" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "LogTransfer Event" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "KeyTransfer Event" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
SPBStartup and SPBShutdown events per [SMPTE-430-5].
Security, Type Operations, Event Subtypes SPBStartup and SPBShutdown, for each of (a) between the times recorded in step 1 and step 5 and (b) after the time recorded in step 5.
SPBStartup and SPBShutdown records have correctly formatted parameters as defined in [SMPTE-430-5].
SPBStartup and SPBShutdown events shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
SPBOpen and SPBClose events per [SMPTE-430-5].
Security, Type Operations, Event Subtypes SPBOpen and SPBClose.
SPBOpen and SPBClose records have correctly formatted parameters as defined in [SMPTE-430-5].
SPBOpen and SPBClose events shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
SPBClockAdjust events per [SMPTE-430-5].
Security, Type Operations, Event Subtype SPBClockAdjust.
SPBClockAdjust records have correctly formatted parameters as defined in [SMPTE-430-5].
SPBClockAdjust event shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
SPBMarriage and SPBDivorce events per [SMPTE-430-5].
Security, Type Operations, Event Subtypes SPBMarriage and SPBDivorce.
SPBMarriage and SPBDivorce records have correctly formatted parameters as defined in [SMPTE-430-5].
SPBMarriage and SPBDivorce events shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
SPBSoftware events per [SMPTE-430-5].
Security, Type Operations, Event Subtype SPBSoftware.
SPBSoftware records have correctly formatted parameters as defined in [SMPTE-430-5].
SPBSoftware event shall be cause to fail this test.
Security, Type Operations, Event Subtype SPBSoftware.
SPBSoftware records have correctly formatted parameters as defined in [SMPTE-430-5].
Missing required elements or incorrect parameters shall be cause to fail this test.
SoftwareFailure exception in the SPBSoftware log record. Record any additional parameters associated with the exception.
A missing SoftwareFailure exception in the associated SPBSoftware log record shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
20.4. OMB Confidence Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
SPBSecurityAlert log events, the respective log records contain correctly coded SPBSecurityAlert events per [SMPTE-430-5].
A SPBSecurityAlert record indicates an event that is not described by one of the other event record types defined in [SMPTE-430-5]. Each Test Subject must be evaluated to determine what conditions may result in a SPBSecurityAlert event being logged. Detailed instructions must be provided by the manufacturer, including any test jigs or applications that may be required to perform the test.
SPBSecurityAlert event recording the condition.
Security, Type Operations, Event Subtype SPBSecurityAlert.
Verify that the SPBSecurityAlert records have correctly formatted parameters as defined in [SMPTE-430-5].
SPBSecurityAlert record, provide an explanation of the condition and any parameters that are recorded.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Data only | — |
20.2. OMB Test Sequence | Data only | — |
21.2. Integrated IMBO Test Sequence | Data only | — |
The Media Block (MB) is a Type 1 SPB comprising a Security Manager (SM) and the Media Decryptors (MD) for all essence types, plus, as required, Forensic Marker (FM) for image or sound and a Timed Text rendering engine (alpha-channel overlay).
Some of the procedures in this section require test content that is specifically malformed. In some implementations, these malformations may be caught and reported directly by the SMS without involving the SM. Because the purpose of the procedures is to assure that the SM demonstrates the required behavior, the manufacturer of the Test Subject may need to provide special test programs or special SMS testing modes to allow the malformed content to be applied directly to the SM.
FrameSequencePlayed records for each track file included in the composition and that the FirstFrame and LastFrame parameter values reflect the interrupted playback.
PlayoutComplete event associated with the interrupted playback.
FrameSequencePlayed record for each track file included in the composition and that the FirstFrame and LastFrame parameter values reflect the interrupted playback.
PlayoutComplete event associated with the interrupted playback.
FrameSequenceError exception in the FrameSequencePlayed log record for the image track file. Record any additional parameters associated with the exception.
TrackFileIDError exception in the FrameSequencePlayed log record for the image track file. Record any additional parameters associated with the exception.
FrameMICError exception in the FrameSequencePlayed log record for the image track file. Record any additional parameters associated with the exception.
FrameMICError exception in the FrameSequencePlayed log record for the image track file. Record any additional parameters associated with the exception.
FrameMICError exception in the FrameSequencePlayed log record for the image track file.
CheckValueError exception in the FrameSequencePlayed log record for the image track file. Record any additional parameters associated with the exception.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
FrameSequenceError exception in the FrameSequencePlayed log record for the sound track file. Record any additional parameters associated with the exception.
TrackFileIDError exception in the FrameSequencePlayed log record for the sound track file. Record any additional parameters associated with the exception.
FrameMICError exception in the FrameSequencePlayed log record for the sound track file. Record any additional parameters associated with the exception.
FrameMICError exception in the FrameSequencePlayed log record for the sound track file. Record any additional parameters associated with the exception.
FrameMICError exception in the FrameSequencePlayed log record for the sound track file.
CheckValueError exception in the FrameSequencePlayed log record for the sound track file. Record any additional parameters associated with the exception.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
The section "Restriction of Keying to Monitored Link Decryptors" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
FrameSequencePlayed log record that contains a KeyTypeError exception. Record any additional parameters associated with the exception.
Failure to produce correct log records shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
CPLCheck event associated with the above operation and:
contentId element contains the Id of the CPL.
Verify that the value of the SignerID parameter contains the Certificate Thumbprint of the certificate used to sign the CPL.
Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of the CPL.
Missing required elements or incorrect parameters shall be cause to fail this test.
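A hedged sketch of the SignerID comparison described above: the helper below assumes, for illustration only, that a Certificate Thumbprint is the Base64-encoded SHA-1 digest of some DER-encoded input; exactly which bytes are digested is defined normatively in [SMPTE-430-2], not here.

```python
import base64
import hashlib

def thumbprint(der_bytes: bytes) -> str:
    """Base64-encoded SHA-1 digest of the given DER-encoded bytes.

    Assumption: the choice of digest input (whole certificate vs. public
    key info) is illustrative; see [SMPTE-430-2] for the normative rule.
    """
    return base64.b64encode(hashlib.sha1(der_bytes).digest()).decode("ascii")

def signer_id_matches(der_bytes: bytes, signer_id_param: str) -> bool:
    """Compare a computed thumbprint against the SignerID parameter value
    recorded in the CPLCheck log record."""
    return thumbprint(der_bytes) == signer_id_param
```

A mismatch here corresponds to an incorrect SignerID parameter, which per the text above is cause to fail the test.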
AssetHashError exception in the CPLCheck log record. Record any additional parameters associated with the exception.
A missing AssetHashError exception shall be cause to fail this test.
CPLCheck event associated with the above operation and:
contentId element contains the Id of the CPL.
Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of the CPL.
Missing required elements or incorrect parameters shall be cause to fail this test.
SignatureError exception in the CPLCheck log record. Record any additional parameters associated with the exception.
A missing SignatureError exception shall be cause to fail this test.
CPLCheck event associated with the above operation and:
contentId element contains the Id of the CPL.
Verify that the value of the SignerID parameter contains the Certificate Thumbprint of the certificate used to sign the CPL.
Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of the CPL.
Missing required elements or incorrect parameters shall be cause to fail this test.
AssetMissingError exception in the CPLCheck log record. Record any additional parameters associated with the exception.
A missing AssetMissingError exception shall be cause to fail this test.
CPLCheck event associated with the above operation and:
CPLFormatError exception in the CPLCheck log record. Record any additional parameters associated with the exception.
A missing CPLFormatError exception shall be cause to fail this test.
CPLCheck event associated with the above operation and:
contentId element contains the Id of the CPL.
Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of the CPL.
Missing required elements or incorrect parameters shall be cause to fail this test.
CertFormatError exception in the CPLCheck log record. Record any additional parameters associated with the exception.
A missing CertFormatError exception shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
The section "Remote SPB Integrity Monitoring" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "SPB Integrity Fault Consequences" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
This test will require KDMs that contain ContentKeysNotValidAfter elements set to a time in the near future. It is recommended that fresh KDMs be generated that will expire 30-60 minutes after beginning the test procedures. Refer to information provided in the relevant step to ensure that the applicable KDM is being used at the appropriate absolute time the step of the test is carried out.
The Test Operator is required to take into account any timezone offsets that may apply to the locality of the Test Subject and the representation of the ContentKeysNotValidAfter element of the KDM. For clarity it is recommended that a common representation be used.
The Security Manager's (SM) clock must be accurately set, to the extent possible, for successful execution of this test.
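The timezone caution can be made concrete with a short sketch (illustrative helper names, not part of the procedure): comparing the ContentKeysNotValidAfter timestamp against the current time is only safe once both values are timezone-aware, at which point two representations of the same instant compare equal.

```python
from datetime import datetime, timedelta, timezone

def kdm_still_valid(not_valid_after: str, now: datetime) -> bool:
    """True if `now` is at or before the KDM's end-of-validity instant.
    `not_valid_after` is the ISO 8601 text of <ContentKeysNotValidAfter>."""
    return now <= datetime.fromisoformat(not_valid_after)

# The same instant written with a different UTC offset still compares
# correctly, which is why timezone-aware handling (or a single common
# representation) matters to the Test Operator.
expiry = "2024-05-02T01:30:00+00:00"
local_now = datetime(2024, 5, 1, 18, 0, tzinfo=timezone(timedelta(hours=-7)))
print(kdm_still_valid(expiry, local_now))  # -> True (01:00 UTC is before 01:30 UTC)
```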
The CPLStart and CPLEnd records are triggered by the first and last edit unit, respectively, of the CPL reproduced by the Test Subject. For example, in the case of an OMB with OBAE capability, the first and last edit units of the CPL are OBAE edit units, since picture edit units are not reproduced despite Main Picture assets being present in the CPL received by the OMB.
<ContentKeysNotValidAfter> element (i.e. the KDM's end of validity timestamp). Note: Steps 2 and 3 must be commenced before the time recorded in this step.
<ContentKeysNotValidAfter> element (i.e. the KDM's end of validity timestamp). Note: Steps 5 and 6 must be commenced before the time recorded in this step.
FrameSequencePlayed, CPLEnd and PlayoutComplete events.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
<ContentAuthenticator> element.
<ContentAuthenticator> element having a certificate thumbprint value that does not match the thumbprint of one of the signer certificates in the certificate chain that signed the associated CPL.
<ContentAuthenticator> element having a certificate thumbprint value that matches the thumbprint of one of the signer certificates in the certificate chain that signed the associated CPL but that certificate has no role.
<ContentAuthenticator> element having a certificate thumbprint value that matches the thumbprint of one of the signer certificates in the certificate chain that signed the associated CPL but that certificate has a bad role (SM).
<ContentAuthenticator> element having a certificate thumbprint value that matches the thumbprint of one of the signer certificates in the certificate chain that signed the associated CPL but that certificate has an extra role.
FrameSequencePlayed events associated with the above steps and:
FrameSequencePlayed log records that contain ContentAuthenticatorError exceptions. Record any additional parameters associated with the exception.
A missing ContentAuthenticatorError exception in any of the associated FrameSequencePlayed log records shall be cause to fail this test.
Only for the operation associated with step 2, a correctly recorded CPLCheck log record with a CertFormatError exception is an allowable substitute for a FrameSequencePlayed log record to satisfy the requirements of this step of the test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
ContentKeysNotValidBefore and ContentKeysNotValidAfter elements.
FrameSequencePlayed events associated with the above steps and:
FrameSequencePlayed log record that contains a ValidityWindowError exception. Record any additional parameters associated with the exception.
A missing ValidityWindowError exception in any of the associated FrameSequencePlayed log records shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the Test Subject checks that the set of SPBs configured for playout is consistent with the TDL (AuthorizedDeviceInfo element) in the controlling KDM.
FrameSequencePlayed record associated with the image track file produced during each step, and confirm that all required elements have correctly formatted parameters as defined in [SMPTE-430-5].
FrameSequencePlayed record contains a Parameter element with a Name equal to DownstreamDevice and a Value equal to the certificate thumbprint of the Imaging Device SPB.
FrameSequencePlayed record contains a TDLError exception. Record all parameters associated with the exception.
FrameSequencePlayed record associated with the image track file produced during each step, and confirm that all required elements have correctly formatted parameters as defined in [SMPTE-430-5].
FrameSequencePlayed record does not contain a Parameter element with a Name equal to DownstreamDevice.
FrameSequencePlayed record contains a TDLError exception. Record all parameters associated with the exception.
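The TDL consistency check described above amounts to a set-membership test. A minimal sketch, assuming each device is identified by its certificate thumbprint and the TDL is taken from the KDM's AuthorizedDeviceInfo element (the thumbprint strings below are hypothetical):

```python
def spbs_authorized(configured_spbs, tdl) -> bool:
    """True if every SPB configured for playout appears on the KDM's TDL.

    Both arguments are iterables of certificate thumbprint strings
    (an illustrative representation of the AuthorizedDeviceInfo list).
    """
    return set(configured_spbs) <= set(tdl)

tdl = ["thumb-imaging-spb", "thumb-omb"]            # hypothetical thumbprints
print(spbs_authorized(["thumb-imaging-spb"], tdl))  # -> True
print(spbs_authorized(["thumb-unknown-spb"], tdl))  # -> False (expect TDLError)
```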
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
The KDMs specified to be used in this test additionally have one of each type of forensic marking keys FMIK and FMAK. Receiving devices shall process such keys in accordance with the individual implementation, in a manner that will not affect the requirements related to the maximum number of content keys (MDIK and MDAK).
The CPLStart and CPLEnd records are triggered by the first and last edit unit, respectively, of the CPL reproduced by the Test Subject. For example, in the case of an OMB with OBAE capability, the first and last edit units of the CPL are OBAE edit units, since picture edit units are not reproduced despite Main Picture assets being present in the CPL received by the OMB.
CPLStart and last CPLEnd records that occurred after the time recorded in Step 3. Let Plaintext Time be the absolute difference between the TimeStamp values of the two records.
CPLStart and last CPLEnd records that occurred after the time recorded in Step 9. Let Ciphertext Time be the absolute difference between the TimeStamp values of the two records.
Ciphertext Time and Plaintext Time is cause to fail this test.
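The Plaintext Time and Ciphertext Time measurements above reduce to a timestamp subtraction. A sketch, assuming the log TimeStamp values parse as ISO 8601 (the timestamps below are made up for illustration):

```python
from datetime import datetime

def span_seconds(cpl_start_ts: str, cpl_end_ts: str) -> float:
    """Absolute difference, in seconds, between the TimeStamp values of the
    first CPLStart and last CPLEnd records of one playout."""
    t0 = datetime.fromisoformat(cpl_start_ts)
    t1 = datetime.fromisoformat(cpl_end_ts)
    return abs((t1 - t0).total_seconds())

plaintext_time = span_seconds("2024-05-01T10:00:00+00:00", "2024-05-01T10:04:10+00:00")
ciphertext_time = span_seconds("2024-05-01T11:00:00+00:00", "2024-05-01T11:04:10+00:00")
print(plaintext_time == ciphertext_time)  # -> True: both spans are 250 seconds
```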
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
<CompositionPlaylistId> element matches the value of the CompositionPlaylistID field of the KDM CipherData structure as specified in [SMPTE-430-1].
<Id> element of DCI 2K StEM (Encrypted) and (ii) the value of the <CompositionPlaylistId> element matches the value of the CompositionPlaylist <Id> element of DCI 2K StEM (Encrypted). Attempt to play DCI 2K StEM (Encrypted). Successful playback is cause to fail this test.
<Id> element of DCI 2K StEM (Encrypted) and (ii) the value of the <CompositionPlaylistId> element does not match the value of the CompositionPlaylist <Id> element in DCI 2K StEM (Encrypted). Attempt to play DCI 2K StEM (Encrypted). Successful playback is cause to fail this test.
KDMKeysReceived events associated with the above steps and:
KDMFormatError exception in the KDMKeysReceived log record. Record any additional parameters associated with the exception.
A missing KDMFormatError exception in any of the associated KDMKeysReceived log records shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
<CompositionPlaylistId> element matches the value of the CompositionPlaylistId field of the KDM CipherData structure as specified in [SMPTE-430-1].
If the Test Subject is an OMB, the KDM targeting the associated IMB is valid, i.e. it is an instance of KDM for 2K StEM (Encrypted) (OBAE) .
CompositionPlaylistId field of the CipherData structure does not match the value of the <Id> element of DCI 2K StEM (OBAE) (Encrypted) and (ii) the value of the <CompositionPlaylistId> element matches the value of the CompositionPlaylist <Id> element of DCI 2K StEM (OBAE) (Encrypted). Attempt to play DCI 2K StEM (OBAE) (Encrypted). Successful playback is cause to fail this test.
CompositionPlaylistId field of the CipherData structure matches the value of the <Id> element of DCI 2K StEM (OBAE) (Encrypted) and (ii) the value of the <CompositionPlaylistId> element does not match the value of the CompositionPlaylist <Id> element in DCI 2K StEM (OBAE) (Encrypted). Attempt to play DCI 2K StEM (OBAE) (Encrypted). Successful playback is cause to fail this test.
KDMKeysReceived events associated with the above steps and:
KDMFormatError exception in the KDMKeysReceived log record. Record any additional parameters associated with the exception.
A missing KDMFormatError exception in any of the associated KDMKeysReceived log records shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
For each of the rows of Table 6.1, perform the following steps in order:
FrameSequencePlayed log record for the associated Malformed Track File and that the record contains a single instance of the specified Exception Token.
PlayoutComplete event associated with the playback.
Failure of any part of any of the steps above shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
KeyType of the key is not equal to "MDEK".
FrameSequencePlayed log record that contains a KeyTypeError exception. Record any additional parameters associated with the exception.
Failure to produce correct log records shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
FrameSequenceError exception in the FrameSequencePlayed log record for the OBAE track file. Record any additional parameters associated with the exception.
TrackFileIDError exception in the FrameSequencePlayed log record for the OBAE track file. Record any additional parameters associated with the exception.
FrameMICError exception in the FrameSequencePlayed log record for the OBAE track file. Record any additional parameters associated with the exception.
FrameMICError exception in the FrameSequencePlayed log record for the OBAE track file. Record any additional parameters associated with the exception.
FrameMICError exception in the FrameSequencePlayed log record for the OBAE track file.
CheckValueError exception in the FrameSequencePlayed log record for the OBAE track file. Record any additional parameters associated with the exception.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
This test requires KDMs that contain ContentKeysNotValidAfter elements set to a time in the near future. It is recommended that fresh KDMs be generated that will expire 30-60 minutes after beginning the test procedures. Refer to information provided in the relevant step to ensure that the applicable KDM is being used at the appropriate absolute time the step of the test is carried out.
The Test Operator is required to take into account any timezone offsets that may apply to the locality of the Test Subject and the representation of the ContentKeysNotValidAfter element of the KDM. For clarity it is recommended that a common representation be used.
The Security Manager's (SM) clock must be accurately set, to the extent possible, for successful execution of this test.
The CPLStart and CPLEnd records are triggered by the first and last edit unit, respectively, of the CPL reproduced by the Test Subject. For example, in the case of an OMB with OBAE capability, the first and last edit units of the CPL are OBAE edit units, since picture edit units are not reproduced despite Main Picture assets being present in the CPL received by the OMB.
Using a Text Editor, open the KDM KDM for Past Time Window Extension (OBAE) (Encrypted) and note the value of the timestamp contained in the <ContentKeysNotValidAfter> element (i.e. the KDM's end of validity timestamp).
Note: Steps 2 and 3 must be commenced before the time recorded in this step.
<ContentKeysNotValidAfter> element (i.e. the KDM's end of validity timestamp). Note: Steps 5 and 6 must be commenced before the time recorded in this step.
Within 5 minutes prior to the timestamp recorded in step 4, attempt to start playing End of Engagement - Within Time Window Extension (OBAE) (Encrypted). The composition should start playback and continue playing in its entirety. If the show fails to start or fails to play out completely, this is cause to fail this test.
Note: The test operator does not have to be present for the entire playback. Sufficient proof of successful playback can be observed by examining the security log for complete FrameSequencePlayed, CPLEnd and PlayoutComplete events.
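The log examination suggested in the note can be sketched as a scan over the recorded Event Subtypes (an illustrative representation; real records follow [SMPTE-430-5]):

```python
def playback_completed(event_subtypes) -> bool:
    """True if the security log contains the three record types that
    together evidence a complete playout."""
    required = {"FrameSequencePlayed", "CPLEnd", "PlayoutComplete"}
    return required <= set(event_subtypes)

log = ["CPLStart", "FrameSequencePlayed", "CPLEnd", "PlayoutComplete"]
print(playback_completed(log))           # -> True
print(playback_completed(["CPLStart"]))  # -> False
```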
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
20.4. OMB Confidence Sequence | Pass/Fail | — |
For simplicity, this test procedure uses the same OBAE content for all Media Blocks (IMB, integrated IMB, IMBO and OMB), since the objective is merely to determine whether playback occurs, not whether a complete presentation occurred.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Failure of any of these above conditions is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
20.4. OMB Confidence Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
24.2. SDR Projector Test Sequence | Pass/Fail | — |
24.4. SDR Projector Confidence Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
26.4. HDR Direct View Display Confidence Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
27.4. SDR Direct View Display Confidence Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
28.4. HDR Projector Confidence Sequence | Pass/Fail | — |
The KDMs specified to be used in this test additionally have one of each type of forensic marking keys FMIK and FMAK. Receiving devices shall process such keys in accordance with the individual implementation, in a manner that will not affect the requirements related to the maximum number of content keys (MDIK and MDAK).
The CPLStart and CPLEnd records are triggered by the first and last edit unit, respectively, of the CPL reproduced by the Test Subject. For example, in the case of an OMB with OBAE capability, the first and last edit units of the CPL are OBAE edit units, since picture edit units are not reproduced despite Main Picture assets being present in the CPL received by the OMB.
CPLStart and last CPLEnd records that occurred after the time recorded in Step 3. Let Plaintext Time be the absolute difference between the TimeStamp values of the two records.
CPLStart and last CPLEnd records that occurred after the time recorded in Step 9. Let Ciphertext Time be the absolute difference between the TimeStamp values of the two records.
Ciphertext Time and Plaintext Time is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
CPLCheck event associated with the above operation and:
contentId element contains the Id of the CPL.
Verify that the value of the SignerID parameter contains the Certificate Thumbprint of the certificate used to sign the CPL.
Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of the CPL.
Missing required elements or incorrect parameters shall be cause to fail this test.
AssetHashError exception in the CPLCheck log record. Record any additional parameters associated with the exception.
A missing AssetHashError exception shall be cause to fail this test.
CPLCheck event associated with the above operation and:
contentId element contains the Id of the CPL.
Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of the CPL.
Missing required elements or incorrect parameters shall be cause to fail this test.
SignatureError exception in the CPLCheck log record. Record any additional parameters associated with the exception.
A missing SignatureError exception shall be cause to fail this test.
CPLCheck event associated with the above operation and:
contentId element contains the Id of the CPL.
Verify that the value of the SignerID parameter contains the Certificate Thumbprint of the certificate used to sign the CPL.
Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of the CPL.
Missing required elements or incorrect parameters shall be cause to fail this test.
AssetMissingError exception in the CPLCheck log record. Record any additional parameters associated with the exception.
A missing AssetMissingError exception shall be cause to fail this test.
CPLCheck event associated with the above operation and:
CPLFormatError exception in the CPLCheck log record. Record any additional parameters associated with the exception.
A missing CPLFormatError exception shall be cause to fail this test.
CPLCheck event associated with the above operation and:
contentId element contains the Id of the CPL.
Verify that ReferencedIDs element contains a CompositionID parameter with a value that is the Id of the CPL.
Missing required elements or incorrect parameters shall be cause to fail this test.
CertFormatError exception in the CPLCheck log record. Record any additional parameters associated with the exception.
A missing CertFormatError exception shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
20.4. OMB Confidence Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
<ContentAuthenticator> element.
<ContentAuthenticator> element having a certificate thumbprint value that does not match the thumbprint of one of the signer certificates in the certificate chain that signed the associated CPL.
<ContentAuthenticator> element having a certificate thumbprint value that matches the thumbprint of one of the signer certificates in the certificate chain that signed the associated CPL, but that certificate has no role.
<ContentAuthenticator> element having a certificate thumbprint value that matches the thumbprint of one of the signer certificates in the certificate chain that signed the associated CPL, but that certificate has a bad role (SM).
<ContentAuthenticator> element having a certificate thumbprint value that matches the thumbprint of one of the signer certificates in the certificate chain that signed the associated CPL, but that certificate has an extra role.
FrameSequencePlayed events associated with the above steps and:
FrameSequencePlayed log records that contain ContentAuthenticatorError exceptions. Record any additional parameters associated with the exception.
A missing ContentAuthenticatorError exception in any of the associated FrameSequencePlayed log records shall be cause to fail this test.
Only for the operation associated with Step 2, a correctly recorded CPLCheck log record with a CertFormatError exception is an allowable substitute for a FrameSequencePlayed log record to satisfy the requirements of this step of the test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
20.4. OMB Confidence Sequence | Pass/Fail | — |
ContentKeysNotValidBefore and ContentKeysNotValidAfter elements.
FrameSequencePlayed events associated with the above steps and:
FrameSequencePlayed log record that contains a ValidityWindowError exception. Record any additional parameters associated with the exception.
A missing ValidityWindowError exception in any of the associated FrameSequencePlayed log records shall be cause to fail this test.
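The validity-window behavior being tested reduces to a comparison of the playout time against the KDM's ContentKeysNotValidBefore and ContentKeysNotValidAfter pair; playout outside that window should produce the ValidityWindowError above. A minimal illustration of the comparison (datetimes only; in the Test Subject this check is performed by the Security Manager):

```python
from datetime import datetime, timezone

def in_validity_window(now, not_valid_before, not_valid_after):
    """True if the playout time falls inside the KDM key validity window."""
    return not_valid_before <= now <= not_valid_after

# Illustrative one-week validity window.
nvb = datetime(2024, 1, 1, tzinfo=timezone.utc)
nva = datetime(2024, 1, 8, tzinfo=timezone.utc)

# A playout after ContentKeysNotValidAfter must yield a ValidityWindowError.
late = datetime(2024, 2, 1, tzinfo=timezone.utc)
expect_exception = not in_validity_window(late, nvb, nva)
```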
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
The section "LDB Trust" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Special Auditorium Situation Operations" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "LE Key Usage" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "MB Link Encryption" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
This section describes general requirements concerning the time awareness of the components of the theater system. All procedures are applicable to the Security Manager, with the notable exception of section 6.3.2 , which is applicable to all SPBs of Type 1.
The following procedures are likely to fail if the Test Subject has had its time adjusted since manufacture. The current time may not be centered on the adjustment range zero point. Any such adjustments, however, will be evidenced in the security log; by examining the relevant TimeOffset elements, the zero point can be derived and the time set accordingly. If necessary, contact the manufacturer for assistance in determining and setting the time to the center of the range of adjustment for the current calendar year.
FrameSequencePlayed event caused by Step 2. Subtract the value of the time recorded in Step 2 (UTC time) from the TimeStamp from the LogRecord (System time). Record this time as the delta of System time to UTC time for the unadjusted state.
SPBClockAdjust event from Step 3 and confirm that the TimeStamp contains a value which is the time recorded in Step 3 (UTC time) + the delta from Step 11 + 6 minutes.
SPBClockAdjust event from Step 7 and confirm that the TimeStamp contains a value which is the time recorded in Step 7 (UTC time) + the delta from Step 11 - 6 minutes.
SPBClockAdjust event from Step 5 and confirm the presence of an Exception with a name of AdjustmentRangeError. Confirm that the TimeStamp contains a value as follows:
TimeOffset parameter shall be ignored.
SPBClockAdjust event from Step 9 and confirm the presence of an Exception with a name of AdjustmentRangeError. Confirm that the TimeStamp contains a value as follows:
TimeOffset parameter shall be ignored.
FrameSequencePlayed event caused by Step 4. Confirm that the TimeStamp contains a value which is the time recorded in Step 4 (UTC time) + the delta from Step 11 + 6 minutes.
FrameSequencePlayed event caused by Step 8. Confirm that the TimeStamp contains a value which is the time recorded in Step 8 (UTC time) + the delta from Step 11 - 6 minutes.
LogRecord elements for Steps 11 through 17 shall be cause to fail this test.
Note: The TimeStamp values will have an accuracy that depends on various factors such as system responsiveness, test operator acuity, etc., and are essentially approximate. The intent is to verify that the TimeStamp values indeed reflect the time adjustments.
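The expected TimeStamp values above are simple offsets: the UTC time noted by the Test Operator, plus the System-to-UTC delta from Step 11, plus or minus the 6-minute adjustment. A sketch of the arithmetic, with a small comparison tolerance for operator and system latency (the tolerance value is illustrative, not specified by this test plan):

```python
from datetime import datetime, timedelta, timezone

def expected_timestamp(recorded_utc, delta, adjust_minutes):
    """Expected log TimeStamp: recorded UTC + System delta +/- adjustment."""
    return recorded_utc + delta + timedelta(minutes=adjust_minutes)

def timestamp_matches(actual, expected, tolerance=timedelta(seconds=5)):
    """Allow for approximate TimeStamp accuracy, per the note above."""
    return abs(actual - expected) <= tolerance

delta = timedelta(seconds=42)  # System time minus UTC time (from Step 11)
recorded = datetime(2024, 6, 1, 12, 0, tzinfo=timezone.utc)
exp_plus = expected_timestamp(recorded, delta, +6)   # e.g. the Step 3 check
exp_minus = expected_timestamp(recorded, delta, -6)  # e.g. the Step 7 check
```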
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
The phrase "record synchronized accurate time" used below means that the Test Operator records the value of the Accurate Real-Time Clock so as to determine a range of predictable deltas between the value of the Accurate Real-Time Clock and the timestamp in the log record that corresponds to an event. It is not important that the two times be equal, but that the difference be predictable to within a range that accommodates both variances in the responsiveness of the Test Subject for time stamping the logged operation and the accuracy of the Test Operator. Note: Each end of the range of the deltas is extended by an additional 2 seconds to allow for minor resolution inaccuracies of the testing methodology.
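The acceptance criterion described above can be expressed numerically: given the observed minimum and maximum deltas between the Accurate Real-Time Clock and the log timestamps, each end of the acceptance range is widened by 2 seconds. A sketch of that range check (delta values in seconds, illustrative):

```python
def acceptance_range(min_delta, max_delta, extension=2.0):
    """Widen each end of the observed delta range by `extension` seconds."""
    return min_delta - extension, max_delta + extension

def delta_acceptable(delta, min_delta, max_delta):
    """True if a measured delta falls inside the extended range."""
    lo, hi = acceptance_range(min_delta, max_delta)
    return lo <= delta <= hi

# Example: observed deltas between 0.5 s and 1.5 s.
lo, hi = acceptance_range(0.5, 1.5)
```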
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
The following procedures are likely to fail if the Test Subject has had its time adjusted since manufacture. The current time may not be centered on the adjustment range zero point. Any such adjustments, however, will be evidenced in the security log; by examining the relevant TimeOffset elements, the zero point can be derived and the time set accordingly. If necessary, contact the manufacturer for assistance in determining and setting the time to the center of the range of adjustment for the current calendar year.
FrameSequencePlayed event caused by Step 2. Subtract the value of the time recorded in Step 2 (UTC time) from the TimeStamp from the LogRecord (System time). Record this time as the delta of System time to UTC time for the unadjusted state.
SPBClockAdjust event from Step 3 and confirm that the TimeStamp contains a value which is the time recorded in Step 3 (UTC time) + the delta from Step 11 + 6 minutes.
SPBClockAdjust event from Step 7 and confirm that the TimeStamp contains a value which is the time recorded in Step 7 (UTC time) + the delta from Step 11 - 6 minutes.
SPBClockAdjust event from Step 5 and confirm the presence of an Exception with a name of AdjustmentRangeError. Confirm that the TimeStamp contains a value as follows:
TimeOffset parameter shall be ignored.
SPBClockAdjust event from Step 9 and confirm the presence of an Exception with a name of AdjustmentRangeError. Confirm that the TimeStamp contains a value as follows:
TimeOffset parameter shall be ignored.
FrameSequencePlayed event caused by Step 4. Confirm that the TimeStamp contains a value which is the time recorded in Step 4 (UTC time) + the delta from Step 11 + 6 minutes.
FrameSequencePlayed event caused by Step 8. Confirm that the TimeStamp contains a value which is the time recorded in Step 8 (UTC time) + the delta from Step 11 - 6 minutes.
LogRecord elements for Steps 11 through 17 shall be cause to fail this test.
Note: The TimeStamp values will have an accuracy that depends on various factors such as system responsiveness, test operator acuity, etc., and are essentially approximate. The intent is to verify that the TimeStamp values indeed reflect the time adjustments.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
20.4. OMB Confidence Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
ForensicMarkFlagList element of the KDM for audio and image Track Files.
ForensicMarkFlagList element of the KDM, and thus the "no FM mark" state applies to the entire CPL/composition, according to the associated KDM.
Note: the equipment manufacturer is required to provide a suitable FM decoder (i.e., software and hardware).
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
An assessment of whether any allowed value for the time stamp and location data can be included in each 5-minute segment is impractical. For example, verifying that all specified timestamp values are allowed would require testing to continue for a full calendar year. Instead, a design review verifies that all specified timestamp and location values can be carried in the Forensic Marking.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
24.2. SDR Projector Test Sequence | Pass/Fail | — |
24.4. SDR Projector Confidence Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
26.4. HDR Direct View Display Confidence Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
27.4. SDR Direct View Display Confidence Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
28.4. HDR Projector Confidence Sequence | Pass/Fail | — |
ForensicMarkFlagList "no FM mark" or "selective audio FM mark" commands.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
ForensicMarkFlagList element of the KDM that enables playout.
ForensicMarkFlagList URI of the form http://www.dcimovies.com/430-1/2006/KDM#mrkflg-audio-disable-above-channel-XX (where XX is a value in the set {01, 02, 03, 04, 05, 06, 07, 08, 09, 10, 11, 12, 13, 14, 15, 16 ... 99}) is allowed in the KDM used to enable the selective audio FM mark command.
ForensicMarkFlagList.
ForensicMarkFlagList.
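The channel suffix in the mrkflg-audio-disable-above-channel-XX URI can be validated with a simple pattern: exactly two digits, from 01 through 99. A sketch of that validation (the URI prefix is copied from the text above; the helper name is illustrative):

```python
import re

PREFIX = ("http://www.dcimovies.com/430-1/2006/KDM"
          "#mrkflg-audio-disable-above-channel-")
# Two digits, excluding "00", anchored at the end of the URI.
PATTERN = re.compile(re.escape(PREFIX) + r"(?!00)(\d{2})$")

def selective_audio_channel(uri):
    """Return the channel number 1..99, or None if the URI is not valid."""
    m = PATTERN.match(uri)
    return int(m.group(1)) if m else None
```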
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
FrameSequencePlayed records that correspond to the encrypted track files played during the presentation segments and:
FrameSequencePlayed records corresponding to the first segment of the presentation (plaintext track files do not generate these records).
FrameSequencePlayed records corresponding to the second segment of the presentation contain values of the ImageMark parameter equal to "true" and do not contain an OBAEMark parameter.
FrameSequencePlayed records corresponding to the third segment of the presentation contain values of the OBAEMark parameter equal to "true" and do not contain an ImageMark parameter.
FrameSequencePlayed records corresponding to the last segment of the presentation:
ImageMark parameter with value "true" and do not contain an OBAEMark parameter; and
OBAEMark parameter with value "true" and do not contain an ImageMark parameter.
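The per-segment expectations above reduce to checking which mark parameter each FrameSequencePlayed record carries and with what value. A sketch over hypothetical parsed records (the dictionary layout is illustrative, not the log record schema):

```python
def marks_ok(record, image_mark=None, obae_mark=None):
    """Check presence/absence and value of ImageMark / OBAEMark parameters.

    Pass None to require that the parameter be absent from the record.
    """
    for key, expected in (("ImageMark", image_mark), ("OBAEMark", obae_mark)):
        if expected is None:
            if key in record:
                return False
        elif record.get(key) != expected:
            return False
    return True

# Second segment: ImageMark "true", no OBAEMark parameter.
second = {"EventType": "FrameSequencePlayed", "ImageMark": "true"}
# Third segment: OBAEMark "true", no ImageMark parameter.
third = {"EventType": "FrameSequencePlayed", "OBAEMark": "true"}
```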
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
ForensicMarkFlagList element of the KDM:
ForensicMarkFlagList element of the KDM.
FrameSequencePlayed records corresponding to the playback and:
FrameSequencePlayed records corresponding to the playback of 2K FM Control Granularity - No FM (OBAE) (Encrypted):
ImageMark parameter with value "false" and do not contain an OBAEMark parameter; and
OBAEMark parameter with value "false" and do not contain an ImageMark parameter.
FrameSequencePlayed records corresponding to the playback of 2K FM Control Granularity - Image Only FM (OBAE) (Encrypted):
ImageMark parameter with value "true" and do not contain an OBAEMark parameter; and
OBAEMark parameter with value "false" and do not contain an ImageMark parameter.
FrameSequencePlayed records corresponding to the playback of 2K FM Control Granularity - OBAE Only FM (OBAE) (Encrypted):
ImageMark parameter with value "false" and do not contain an OBAEMark parameter; and
OBAEMark parameter with value "true" and do not contain an ImageMark parameter.
FrameSequencePlayed records corresponding to the playback of 2K FM Control Granularity - Image and OBAE FM (OBAE) (Encrypted):
ImageMark parameter with value "true" and do not contain an OBAEMark parameter; and
OBAEMark parameter with value "true" and do not contain an ImageMark parameter.
Note: the equipment manufacturer is required to provide a suitable FM decoder (i.e., software and hardware).
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
An assessment of whether any allowed value for the time stamp and location data can be included in each 5-minute segment is impractical. For example, verifying that all specified timestamp values are allowed would require testing to continue for a full calendar year. Instead, a design review verifies that all specified timestamp and location values can be carried in the Forensic Marking.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
20.4. OMB Confidence Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
ForensicMarkFlagList "no FM mark" commands.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the playback system allows click free splicing of the audio track files.
Note: Playback of this test must be done in a properly equipped and set up movie theater, at reference level, i.e., fader setting 7.0 for Dolby and compatibles or fader setting 0 dB for Sony and compatibles. A single channel of pink noise at -20 dBFS should produce a Sound Pressure Level (SPL) of 85 dBc, from any of the front loudspeakers, at the monitoring position. Monitoring by means of smaller monitor boxes or headphones is not sufficient.
Play back DCP for Audio Tone Multi-Reel (Encrypted) , which contains a sequence of audio track files arranged such that no discontinuity exists at the splice points.
Any audible snap, crackle, pop or other unpleasant artifact at any splice point shall be cause to fail this test.
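The reference-level relationship in the note above (pink noise at -20 dBFS producing 85 dB SPL) rests on the standard dBFS-to-linear conversion and on dB arithmetic. A sketch of both conversions (helper names are illustrative):

```python
def dbfs_to_linear(dbfs):
    """Convert a dBFS level to a linear full-scale amplitude ratio."""
    return 10 ** (dbfs / 20.0)

def expected_spl(dbfs, spl_at_minus_20=85.0):
    """Expected SPL for a channel level, given 85 dBc at -20 dBFS.

    Each dB of signal level maps to one dB of SPL at the monitoring
    position, assuming the chain is calibrated to reference level.
    """
    return spl_at_minus_20 + (dbfs - (-20.0))
```

For example, a -26 dBFS tone through the same calibrated chain would be expected to measure 6 dB lower than reference, i.e. 79 dB SPL.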
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
The section "Timed Text Synchronization" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Support for Multiple Captions" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Only applies to a Test Subject that implements an alpha channel overlay module, a subpicture renderer (a module that converts the subpicture file into a baseband image file with an alpha channel) and a Timed Text renderer (a module that converts Timed Text data into a baseband image file with an alpha channel).
Verify that the Test Subject provides a default font to be used in the case where no font files are supplied with the DCP.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
24.2. SDR Projector Test Sequence | Pass/Fail | — |
The section "Support for Subpicture Display" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the playback system allows click free splicing of OBAE track files.
Playback of this test must be done in a theatrical environment calibrated and setup for OBAE reproduction. Monitoring by means of smaller monitor boxes or headphones is not sufficient.
Any audible snap, crackle, pop or other unpleasant artifact at any splice point shall be cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Figure 6.2 shows what a typical measurement is expected to look like. The upper trace shows the light output of the Imaging Device, measured by means of the photo diode. The photo diode signal is shown inverted, i.e., low means high light output. The lower trace shows the analog center channel output.
The optical flashes generated during this test can cause physiological reactions in some people. People who are sensitive to such optical stimuli should not view the test material.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the playback system supports playback of OBAE content that consists of maximum size frames, as defined in [SMPTE-429-18] .
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Verify that the OBAE Sound System meets acoustic rendering expectations.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
When making image measurements on any Imaging Device:
Stray light on the screen shall be minimized. The room lights in the test environment shall be turned off, with the exception of the minimal lighting provided for working or safety reasons. The use of black nonreflective surfaces with recessed lighting is encouraged.
Note that, outside of the Test Environment, e.g., in exhibition theaters or review rooms, safety regulations and the placement of exit lights or access lights can result in a higher ambient light level.
With the Projector turned off or the douser closed, the level of ambient light reflected by the screen shall be:
The screen shall be non-specular and equally reflective over the entire visible spectrum. The screen should have variable black masking, adjustable to tightly frame the projected image (at a minimum, this should include the 1.85:1 and 2.39:1 image formats).
When making image measurements on a Projector:
7.5.13. Projector Test Environment records information about the test environment in which projector test procedures were conducted.
With the Direct View Display turned off, the level of ambient light reflected by the screen shall be less than 0.0005 cd/m².
The Direct View Display shall be turned on and allowed to thermally stabilize for 20 to 30 minutes prior to all measurements.
7.5.30. Direct View Display Test Environment records information about the test environment in which the test procedures were conducted.
When performing stereoscopic measurements:
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | — |
24.4. SDR Projector Confidence Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
26.4. HDR Direct View Display Confidence Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
27.4. SDR Direct View Display Confidence Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
28.4. HDR Projector Confidence Sequence | Pass/Fail | — |
The section "SPB2 Requirements" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "SPB2 Secure Silicon Requirements" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "SPB2 Tamper Evidence" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
The section "Projector Companion SPB Location" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
DeviceSourceID element contains the Certificate Thumbprint of the companion SPB.
DeviceConnectedID element contains the Certificate Thumbprint of the Imaging Device SPB2.
AuthId record.
DeviceSourceID element contains the Certificate Thumbprint of the companion SPB.
DeviceConnectedID element contains the Certificate Thumbprint of the Imaging Device SPB2.
AuthId record.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
This section only applies to systems that implement an Electronic Marriage, i.e., those that have field-replaceable companion MBs.
In the case of an MB that is married to an Imaging Device SPB and implements dual certificates as defined in Section 9.5.1.2 of [DCI-DCSS]:

Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
The section "Remote SPB Clock Adjustment" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "LDB without Electronic Marriage" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "LDB TLS Session Constraints" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "LDB Time-Awareness" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "LDB ASM Conformity" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "LDB Key Storage" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "LDB Key Purging" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "LDB Logging" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Projector Overlay" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Projector Lens" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The 4K pattern consists of a 256 x 135 grid of 16 x 16 pixel arrays. A single-pixel white border surrounds the pattern.
Within each block, color-coded bands mark pixel positions. The bands may have North, South, East or West orientation (the example blocks have South orientation). Pixel positions are coded left to right (top to bottom for East and West orientations) with the following color sequence: brown, red, orange, yellow, green, blue, violet, gray.
Note: North, South, East and West orientations are provided in the test materials set to support investigation of anomalies.
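The grid geometry above implies full 4K container coverage: 256 x 16 = 4096 pixels wide and 135 x 16 = 2160 pixels high, with the single-pixel white border at the pattern edge. A quick check of the arithmetic:

```python
BLOCKS_X, BLOCKS_Y = 256, 135   # grid of blocks, per the pattern description
BLOCK = 16                      # pixels per block side

# The pattern fills the full 4K image container.
width, height = BLOCKS_X * BLOCK, BLOCKS_Y * BLOCK
```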
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | — |
24.4. SDR Projector Confidence Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
26.4. HDR Direct View Display Confidence Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
27.4. SDR Direct View Display Confidence Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
28.4. HDR Projector Confidence Sequence | Pass/Fail | — |
The section "Projector Spatial Resolution and Frame Rate Conversion" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "White Point Luminance and Uniformity" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "White Point Chromaticity and Uniformity" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Sequential Contrast" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Any verification that fails is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Data only | — |
26.2. HDR Direct View Display Test Sequence | Data only | — |
27.2. SDR Direct View Display Test Sequence | Data only | — |
28.2. HDR Projector Test Sequence | Data only | — |
Any verification that fails is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Any verification that fails is cause to fail this test.
Step Number | Nominal Luminance above the Screen Black Level (cd/m²) | Tolerance |
---|---|---|
1 | 0.121 | ±5% |
2 | 0.731 | ±5% |
3 | 2.098 | ±3% |
4 | 4.432 | ±3% |
5 | 7.917 | ±3% |
6 | 12.718 | ±3% |
7 | 18.988 | ±3% |
8 | 26.870 | ±3% |
9 | 36.497 | ±3% |
10 | 47.999 | ±3% |
Step Number | Nominal Luminance above the Screen Black Level (cd/m²) | Tolerance |
---|---|---|
1 | 0.006 | ±20% |
2 | 0.038 | ±5% |
3 | 0.111 | ±5% |
4 | 0.234 | ±5% |
5 | 0.418 | ±5% |
6 | 0.670 | ±5% |
7 | 1.002 | ±3% |
8 | 1.418 | ±3% |
9 | 1.928 | ±3% |
10 | 2.531 | ±3% |
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Any measurement outside of specified tolerances is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | All data recorded per the test procedure |
28.2. HDR Projector Test Sequence | Pass/Fail | All data recorded per the test procedure |
Any verification that fails is cause to fail this test.
Parameter | Test Subject | |
---|---|---|
Projector | Direct View Display | |
Center luminance (cd/m²) | 299.6 ± 18 | 299.6 ± 9 |
Center chrominance ( x , y ) | (0.3127 ± 0.002, 0.3290 ± 0.002) |
Parameter | Test Subject | |
---|---|---|
Projector | Direct View Display | |
Maximum 𝒩 | 15% | 6% |
Maximum Δ u ′ v ′ | 0.0182 |
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Any verification that fails is cause to fail this test.
Parameter | Test Subject | |
---|---|---|
Projector | Direct View Display | |
Center luminance (cd/m²) | 48.0 ± 3.5 | |
Center chrominance ( x , y ) | (0.314 ± 0.002, 0.351 ± 0.002) |
Parameter | Test Subject | |
---|---|---|
Projector | Direct View Display | |
Maximum 𝒩 | 20% | 6% |
Maximum Δ u ′ v ′ | 0.0171 |
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Any measurement outside of specified tolerances is cause to fail this test.
Patch | Nominal values (cd/m²) | Tolerances | |
---|---|---|---|
Projector | Direct View Display | ||
Red-1 | 68.13 | ±6% | ±3% |
Green-1 | 207.35 | ||
Blue-1 | 23.86 |
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
Any verification that fails is cause to fail this test.
Angular positions | Luminance change ratio tolerance | Viewing-angle color variation tolerance
---|---|---|
+10° vertically (up) | Full Screen Vertical Off-Axis Luminance Uniformity at [DV-ADD] | Full Screen Vertical Off-Axis White Chromaticity Uniformity at [DV-ADD]
-35° vertically (down) | Full Screen Vertical Off-Axis Luminance Uniformity at [DV-ADD] | Full Screen Vertical Off-Axis White Chromaticity Uniformity at [DV-ADD]
-60° horizontally (left) | Full Screen Horizontal Off-Axis Luminance Uniformity at [DV-ADD] | Full Screen Horizontal Off-Axis White Chromaticity Uniformity at [DV-ADD]
+60° horizontally (right) | Full Screen Horizontal Off-Axis Luminance Uniformity at [DV-ADD] | Full Screen Horizontal Off-Axis White Chromaticity Uniformity at [DV-ADD]
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
This test procedure only applies to a Test Subject that supports stereoscopic presentations.
Any verification that fails is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
This test procedure only applies to a Test Subject that supports SDR stereoscopic presentations.
Any verification that fails is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
If the measurement device or procedure reports the diffuse reflectivity at different optical wavelengths, the weighted average using the CIE Y Color Matching Function shall be used to combine different values at different wavelengths into a single diffuse reflectivity value that is photometrically weighted.
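The photometric weighting described above reduces to a weighted average. A minimal sketch (hypothetical function and argument names; real use requires the tabulated CIE Y color matching function sampled at the same wavelengths as the measurement):

```python
def photopic_weighted_reflectivity(reflectivity: list, cie_y_bar: list) -> float:
    """Combine per-wavelength diffuse reflectivity values into one
    photometrically weighted value, using CIE Y (ybar) weights sampled
    at the same wavelengths as the reflectivity data."""
    if len(reflectivity) != len(cie_y_bar):
        raise ValueError("spectral samples must align")
    num = sum(r * y for r, y in zip(reflectivity, cie_y_bar))
    den = sum(cie_y_bar)
    return num / den

# A spectrally flat 5% reflector stays 5% regardless of the weighting:
photopic_weighted_reflectivity([0.05, 0.05, 0.05], [0.323, 0.995, 0.631])
```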
Any verification that fails is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
Any verification that fails is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
This test procedure only applies to a Test Subject that supports SDR stereoscopic presentations.
Any verification that fails is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
Any verification that fails is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
Any measurement outside of specified tolerances is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
Any verification that fails is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
Any verification that fails is cause to fail this test.
Step Number | Nominal Luminance (cd/m²) | Tolerance (Projector) | Tolerance (Direct View Display)
---|---|---|---|
1 | 0.50 | ±5% | ±12%
2 | 1.00 | ±5% | ±12%
3 | 2.00 | ±3% | ±6%
4 | 5.00 | ±3% | ±6%
5 | 9.99 | ±3% | ±6%
6 | 20.00 | ±3% | ±6%
7 | 50.01 | ±3% | ±6%
8 | 100.10 | ±3% | ±6%
9 | 200.21 | ±3% | ±6%
10 | 299.64 | ±3% | ±6%
Step Number | Nominal Luminance (cd/m²) | Tolerance (Projector) | Tolerance (Direct View Display)
---|---|---|---|
1 | 0.0050 | ±20% | ±20%
2 | 0.0075 | ±20% | ±20%
3 | 0.0100 | ±20% | ±20%
4 | 0.0151 | ±20% | ±20%
5 | 0.0202 | ±5% | ±12%
6 | 0.0352 | ±5% | ±12%
7 | 0.0501 | ±5% | ±12%
8 | 0.0752 | ±5% | ±12%
9 | 0.0998 | ±5% | ±12%
10 | 0.1997 | ±5% | ±12%
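Conformance to the step tolerances in the tables above reduces to a relative-error check against the nominal value. A minimal sketch (illustrative only, not a prescribed procedure):

```python
def within_tolerance(measured: float, nominal: float, tol_pct: float) -> bool:
    """True when the measured luminance is within ±tol_pct percent of nominal."""
    return abs(measured - nominal) <= nominal * tol_pct / 100.0

# Step 10 for a projector: 299.64 cd/m² nominal, ±3% → allowed band ±8.99 cd/m²
within_tolerance(305.0, 299.64, 3.0)   # True  (|5.36| <= 8.99)
within_tolerance(289.0, 299.64, 3.0)   # False (|10.64| > 8.99)
```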
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Any verification that fails is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Failure to record any data is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | All data recorded per the test procedure |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | All data recorded per the test procedure |
Any verification that fails is cause to fail this test.
Patch | Allowable luminance range, Projector (cd/m²) | Allowable luminance range, Direct View Display (cd/m²)
---|---|---|
SDR dark | [0.01, 0.032] | [0.01, 0.024]
SDR light | 15.20 ± 0.46 | 15.20 ± 0.46
HDR dark | 0.005 ± 0.001 | 0.005 ± 0.001
HDR light | 299.6 ± 18 | 299.6 ± 9
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
Any verification that fails is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
24.2. SDR Projector Test Sequence | Pass/Fail | — |
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
This test procedure only applies to a Test Subject that supports stereoscopic presentations.
Any verification that fails is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
27.2. SDR Direct View Display Test Sequence | Pass/Fail | — |
Any verification that fails is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
26.2. HDR Direct View Display Test Sequence | Pass/Fail | — |
28.2. HDR Projector Test Sequence | Pass/Fail | — |
A Screen Management System (SMS) (or Theater Management System (TMS)) is responsible for providing the operator's interface for ingest, scheduling, reporting, etc. In this document the term SMS will be used exclusively, although the same test procedures can apply to a TMS that is able to directly manage a suite of equipment for a screen.
The SMS is not hosted on secure hardware (i.e., it is not required to be within an SPB).
Verify that the system provides an interface to the storage system, for DCP ingest, that is Ethernet, 1Gb/s or better, over copper (1000Base-T) or fiber (1000Base-FX), as described in [IEEE-802-3] , running the TCP/IP protocol.
Sequence | Type | Measured Data
---|---|---|
Sequence | Type | Measured Data
---|---|---|
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
FrameSequencePlayed and PlayoutComplete events recorded during the playback for complete and successful reproduction.
Any exceptions or missing FrameSequencePlayed or PlayoutComplete events are cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
The section "Storage System Redundancy (OBAE)" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
FrameSequencePlayed and PlayoutComplete events recorded during the playback for complete and successful reproduction.
Any exceptions or missing FrameSequencePlayed or PlayoutComplete events are cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
The section "Screen Management System" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Data only | — |
21.2. Integrated IMBO Test Sequence | Data only | — |
$ schema-check <input-file>
schema validation successful
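The schema-check invocation validates a document against its XML schema. For orientation only, a stand-in well-formedness check using the Python standard library (note that xml.etree checks syntax only; full schema validation requires an XSD-capable validator, as the schema-check tool defined by the test environment provides):

```python
import xml.etree.ElementTree as ET

def is_well_formed(xml_text: str) -> bool:
    """Syntax-only check; XSD conformance needs a schema-aware validator."""
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False

is_well_formed("<Root><Child/></Root>")  # True: parses cleanly
```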
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
The section "KDM Validity Checks" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | Record the available operator roles (names) and whether locally-defined accounts can be created. |
21.2. Integrated IMBO Test Sequence | Pass/Fail | Record the available operator roles (names) and whether locally-defined accounts can be created. |
FrameSequencePlayed playout events.
FrameSequencePlayed event for both audio and image and that they each contain a parameter named AuthId with a value that is not absent.
AuthId value.
Any missing AuthId parameter or any AuthId parameter that has a value that is unpopulated is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
SMS role, or the TMS role, unless the SMS is contained within an SPB meeting the protection requirements for any other designated roles.
SMS role, or the TMS role, is cause to fail the test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
Two instances of each KDM listed below are needed if the Test Subject is an OMB: one instance of each KDM for the IMB and one instance of each KDM for the OMB.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
20.4. OMB Confidence Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
For each of the rows of Table 8.2, create a Show Playlist with the Composition and attempt to play it using the Malformed KDM. If playback begins, this is cause to fail this test.
Composition | Malformed KDM |
---|---|
sync_test_with_subs_ct.cpl.xml | m0100_missing_key_pict.kdm.xml |
sync_test_with_subs_ct.cpl.xml | m0102_missing_key_snd.kdm.xml |
sync_test_with_subs_ct.cpl.xml | m0104_missing_key_sub.kdm.xml |
2K_sync_test_with_subs_obae_ct.cpl.xml | m0106_missing_key_pict_obae.kdm.xml |
2K_sync_test_with_subs_obae_ct.cpl.xml | m0108_missing_key_snd_obae.kdm.xml |
2K_sync_test_with_subs_obae_ct.cpl.xml | m0110_missing_key_sub_obae.kdm.xml |
2K_sync_test_with_subs_obae_ct.cpl.xml | m0112_missing_key_obae_obae.kdm.xml |
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Failure of any of the above conditions is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
15.2. Integrated IMB Test Sequence | Pass/Fail | — |
15.4. Integrated IMB Confidence Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
21.4. Integrated IMBO Confidence Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
21.2. Integrated IMBO Test Sequence | Pass/Fail | — |
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Data only | — |
21.2. Integrated IMBO Test Sequence | Data only | — |
The section "Automation Control and Interfaces (OBAE)" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
FrameSequencePlayed playout events.
FrameSequencePlayed event for both audio and image and that they each contain a parameter named AuthId with a value that is not absent.
AuthId value.
Any missing AuthId parameter or any AuthId parameter that has a value that is unpopulated is cause to fail this test.
Sequence | Type | Measured Data |
---|---|---|
20.2. OMB Test Sequence | Pass/Fail | — |
Type 1 Secure Processing Blocks (SPB) are required by DCI to conform to a U.S. National Institute of Standards and Technology (NIST) FIPS 140 version in effect at the time of DCI compliance testing. Testing for compliance with FIPS 140 is performed by independent CMVP testing laboratories accredited by NIST NVLAP.
In May 2019, NIST announced the plan and schedule to migrate the security requirements for cryptographic modules from FIPS 140-2 to FIPS 140-3. To allow this chapter (Chapter 9. FIPS Requirements for a Type 1 SPB) and references throughout the CTP to accommodate both documents, FIPS 140-2 and FIPS 140-3 references have been revised to refer generically to FIPS 140, unless otherwise noted.
The testing program, known as the Cryptographic Module Validation Program (CMVP), is a joint effort of NIST's Computer Systems Laboratory (CSL) and the Communications Security Establishment (CSE) of the Government of Canada. More information about CMVP can be found on the NIST web site at http://csrc.nist.gov/groups/STM/. To be compliant with the DCI System Specification, a Type 1 SPB device must be tested by an accredited CMVP testing laboratory, the resulting documentation must be submitted to NIST/CSE for examination, and a validation certificate must be issued by NIST/CSE. Throughout this document, the term "FIPS 140 testing" will refer to this entire process.
FIPS 140 testing is very thorough but also very selective. To determine whether Type 1 SPB meets the DCI requirements, the documents prepared for and presented to the CMVP testing laboratory by the manufacturer must be reviewed by an examiner as guided by the requirements presented in this chapter. This chapter will briefly explain the FIPS testing process and the documentation that is produced. A procedure will be presented that will guide the examiner through the task of evaluating a FIPS 140 test report and determining the DCI compliance status of the respective Test Subject.
This section will explain the process of obtaining a FIPS 140 validation certificate from NIST/CSE. This information is intended to guide the examiner in understanding the documentation that will be produced in that process. This information is not exhaustive and is not intended to guide a manufacturer in obtaining a validation certificate. The following sub-sections illustrate the tasks in a typical validation process.
NIST makes available the list of accredited CMVP testing laboratories on the agency web site (see http://csrc.nist.gov/groups/STM/testing_labs/index.html). Any of the testing laboratories can be used, but some restrictions may apply. For example, a laboratory that is owned by the Test Subject manufacturer, or one that contributed to the design of the Test Subject, will be disqualified from testing that Test Subject. More information about CMVP testing laboratories and CMVP testing laboratory selection can be found in Frequently Asked Questions for the Cryptographic Module Validation Program (http://csrc.nist.gov/groups/STM/cmvp/documents/CMVPFAQ.pdf).
The FIPS 140 validation test report prepared by the CMVP testing laboratory is a proprietary and closely controlled document. The manufacturer must ensure that it has permission to disclose the test report to the Testing Organization.
The manufacturer is responsible for implementing a compliant design and for submitting the required testing evidence to the CMVP testing laboratory for review and testing.
Additionally, the manufacturer may be required to develop test jigs to facilitate the error injection process; for example, to simulate tamper events and other hardware failures.
The CMVP maintains a list of all cryptographic modules validated to FIPS 140 requirements. This list is published online at http://csrc.nist.gov/groups/STM/cmvp/validation.html. The CMVP also maintains a list of cryptographic modules currently undergoing FIPS 140 testing (a listing on the CMVP pre-validation website does not equate to having a FIPS 140 validation). The pre-validation list is at http://csrc.nist.gov/groups/STM/cmvp/inprocess.html.
The CMVP testing laboratory will review and analyze design materials during the validation testing process. The following list shows the documents generally expected to be submitted.
A FIPS 140 validation test report is created by CMVP testing laboratory engineers for submission to CMVP. The report details the documentation received and the test engineer's evaluation of the implementation's fidelity to the documentation and FIPS 140 requirements. The module tested receives a FIPS 140 validation certificate (i.e., either [FIPS-140-2] or [FIPS-140-3] ) once the CMVP reviews and approves the test report.
The CMVP testing laboratory assessments contained within a FIPS 140 validation test report address each of the applicable "TE" requirements corresponding to the eleven areas specified in the FIPS 140 Derived Test Requirements (DTR). These requirements instruct the tester as to what he or she must do in order to test the cryptographic module with respect to the given assertion (which is a statement that must be true for the module to satisfy the requirement of a given area at a given level).
For each applicable FIPS 140 "TE", the tester's assessment includes:
The Testing Organization must obtain an official copy of the FIPS 140 validation test report directly from the CMVP testing laboratory that performed the testing. The Test Operator must verify that the name of the cryptographic module and version (software, hardware, firmware) under review are identical to the versions reviewed for the FIPS 140 validation certificate, and supporting CAVP algorithm validation certificate(s).
To confirm whether the cryptographic module satisfies the DCI requirements, the Test Operator must review the "TE" assessments (and associated references as needed) that are relevant to the corresponding DCI requirements; the specific assessments are located below with the respective DCI requirements. The functionality described must be consistent with the observed implementation.
Each of the subsections below describes a DCI requirement that must be proven by examining the FIPS 140 validation report. For each requirement, observe the design of the respective system element (with the aid of the Test Subject Representative) and record whether or not the design meets the requirement.
Verify that the Security Manager (SM) operating environment is limited to the FIPS 140 "limited operational" or "non-modifiable operational" environment category.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
The section "LE Key Generation" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
For components of the system designated Type 1 SPB , verify the following:
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
The section "Security Design Description Requirements" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "SPB1 Tamper Resistance" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
For components of the system designated Type 1 SPB , verify the following: the component meets and is certified for the requirements of FIPS 140 Level 3 in all areas except those subject to the exceptions or additional notes as specified in [DCI-DCSS] , Section 9.5.2.5.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
The section "SPB1 Secure Silicon FIPS Requirements" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
For components of the system designated Type 1 SPB , verify that keys are generated as specified in [RFC-3447] and per the requirements of FIPS 140 "Cryptographic Key Management" and the [DCI-DCSS] , Section 9.5.2.5.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that the following Critical Security Parameters (CSPs) receive Secure Processing Block (SPB) Type 1 protection, whenever they exist outside of their originally encrypted state, in accordance with FIPS 140 and the requirements of [DCI-DCSS], Section 9.5.2.5:
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
The section "SPB 1 Firmware Modifications" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
This procedure is applicable only to FIPS-140-3 certification.
Verify that degraded mode(s) of operation, as defined in FIPS-140-3 , are not implemented.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
This procedure is applicable only to FIPS-140-3 certification.
Verify that the SPB Type 1 inhibits its control output interface during each error state, as specified in FIPS-140-3 .
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
This procedure is applicable only to FIPS-140-3 certification.
Verify that a maintenance role/interface, as defined in FIPS-140-3 , is not implemented.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
This procedure is applicable only to FIPS-140-3 certification.
Verify that, if the SPB Type 1 supports "self-initiated cryptographic output capability," a User Role and/or Crypto Officer Role is required to support the AuthorityID requirements of [DCI-DCSS], 9.4.2.5.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
This procedure is applicable only to FIPS-140-3 certification.
Verify, by review of design documentation, that the strength and hardness of the SPB Type 1 physical security enclosure material(s) are sustained over the SPB Type 1's range of operation, storage, and distribution. Verify that destructive physical attacks performed on the SPB Type 1 at nominal temperature(s) confirmed the strength and hardness of the enclosure material(s).
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
This procedure is applicable only to FIPS-140-3 certification.
Verify that the specified Security Policy maximum time between periodic self-tests, as defined in FIPS-140-3 , is not more than one week.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
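The one-week bound on the periodic self-test interval can be expressed as a simple configuration check. A minimal sketch (the names are illustrative; how a given module's Security Policy exposes its interval is implementation-specific):

```python
from datetime import timedelta

# Maximum time between periodic self-tests permitted by the requirement above.
MAX_SELF_TEST_INTERVAL = timedelta(weeks=1)

def self_test_interval_compliant(interval_seconds: int) -> bool:
    """True when the configured periodic self-test interval does not
    exceed the one-week maximum ("not more than one week")."""
    return timedelta(seconds=interval_seconds) <= MAX_SELF_TEST_INTERVAL

self_test_interval_compliant(24 * 3600)      # daily self-test: compliant
self_test_interval_compliant(8 * 24 * 3600)  # every 8 days: non-compliant
```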
Like the previous chapter, this chapter contains procedures for evaluating system design for fidelity to DCI requirements that cannot be tested by direct examination of a finished product. These requirements differ in that they are not proven by the FIPS 140 certification process; the process of proving them is the same, however. Documentation must be produced and Test Subjects must be instrumented to give the examiner all necessary information to evaluate the design. Manufacturers must produce proof in the form of design documentation for each of the applicable requirements specified in this chapter. To see which requirements are relevant to a particular Test Subject, consult the Design Review sections of Part III. Consolidated Test Procedures.
To complete a compliance evaluation using the requirements in this section, the examiner must be presented with the documentation detailed below. The examiner must also have access to a Test Sample (a production-grade sample of the system, conforming to the operational capabilities of the Design Review sequence being used). Wherever possible, the examiner should confirm that the documentation matches the Test Sample.
For a Type 1 SPB, it should be possible to validate the requirements in this chapter using much of the test material produced for the FIPS 140 test. It may be necessary for the manufacturer to provide additional information in the case where a requirement is not provable using documentation prepared with only the FIPS 140 test in mind. Manufacturers are encouraged to consider the objectives of this chapter when preparing material for the FIPS 140 test of a Type 1 SPB.
The following documents (repeated from Chapter 9 ) are examples of the types of documentation that will be useful when proving compliance with the requirements presented in this chapter:
For a Type 2 SPB, it is necessary to produce documentation to validate the requirements in this chapter. Because a Type 2 SPB is not required to undergo FIPS 140 testing, this documentation will be produced only for the purpose of this DCI compliance test. Note that the documentation need not cover aspects of the design that are not the subject of the requirements.
The following documentation must be supplied:
In addition to the above, any documentation that can be used to prove that the design meets a particular requirement should be provided.
For a Test Subject which implements Forensic Marking (FM), it will be necessary to provide, in addition to the documentation listed above, an intellectual property disclosure statement which describes any claims on intellectual property that the manufacturer intends to make on the FM algorithm.
Each of the subsections below describes a DCI requirement that must be proven by examining the manufacturer's documentation.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
24.3. SDR Projector Design Review | Pass/Fail |
26.3. HDR Direct View Display Design Review | Pass/Fail |
27.3. SDR Direct View Display Design Review | Pass/Fail |
28.3. HDR Projector Design Review | Pass/Fail |
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
The section "Security Devices Self-Test Capabilities" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
announce failures and take themselves out of service.
Supporting Materials
Reference Documents: DCI-DCSS, 9.4.1
Consolidated Test Sequences
Sequence | Type
---|---|
15.3. Integrated IMB Design Review | Pass/Fail
20.3. OMB Design Review | Pass/Fail
21.3. Integrated IMBO Design Review | Pass/Fail
24.3. SDR Projector Design Review | Pass/Fail
26.3. HDR Direct View Display Design Review | Pass/Fail
27.3. SDR Direct View Display Design Review | Pass/Fail
28.3. HDR Projector Design Review | Pass/Fail
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Does not apply to an SMS that is permanently integrated.
Verify that the SMS communicates with the SM under its control using:
TLS_RSA_WITH_AES_128_CBC_SHA, as defined in [RFC-3268], when TLS 1.0 is used.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
The section "SM Usage of OS Security Features" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Secure Remote SPB-SM Communications" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Playback Preparation" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
showtime.
Supporting Materials
Reference Documents: DCI-DCSS, 9.4.3.5
Consolidated Test Sequences
Sequence | Type
---|---|
15.3. Integrated IMB Design Review | Pass/Fail
20.3. OMB Design Review | Pass/Fail
21.3. Integrated IMBO Design Review | Pass/Fail
The section "Special Auditorium Situation Detection" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Prevention of Keying of Compromised SPBs" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
SM precludes delivery of keys or content to, or play back on, devices reporting a Security Alert.
Supporting Materials
Reference Documents: DCI-DCSS, 9.4.3.5
Consolidated Test Sequences
Sequence | Type
---|---|
15.3. Integrated IMB Design Review | Pass/Fail
20.3. OMB Design Review | Pass/Fail
21.3. Integrated IMBO Design Review | Pass/Fail
The section "SPB Authentication" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "TLS Session Key Refreshes" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "LE Key Issuance" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Maximum Key Validity Period" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
The section "Key Usage Time Window" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
started within the KDM playout time window, but the playout time window expires before the end of playout. In this case the show may playout beyond the playout time window by a maximum of six (6) hours.
Supporting Materials
Reference Documents: DCI-DCSS, 9.4.3.5
Consolidated Test Sequences
Sequence | Type
---|---|
15.3. Integrated IMB Design Review | Pass/Fail
20.3. OMB Design Review | Pass/Fail
21.3. Integrated IMBO Design Review | Pass/Fail
Sequence | Type |
---|---|
24.3. SDR Projector Design Review | Pass/Fail |
26.3. HDR Direct View Display Design Review | Pass/Fail |
27.3. SDR Direct View Display Design Review | Pass/Fail |
28.3. HDR Projector Design Review | Pass/Fail |
Sequence | Type |
---|---|
24.3. SDR Projector Design Review | Pass/Fail |
26.3. HDR Direct View Display Design Review | Pass/Fail |
27.3. SDR Direct View Display Design Review | Pass/Fail |
28.3. HDR Projector Design Review | Pass/Fail |
Sequence | Type |
---|---|
24.3. SDR Projector Design Review | Pass/Fail |
26.3. HDR Direct View Display Design Review | Pass/Fail |
27.3. SDR Direct View Display Design Review | Pass/Fail |
28.3. HDR Projector Design Review | Pass/Fail |
Verify that, in the configuration of a permanently married companion SPB (MB), the companion SPB is not field replaceable, and that the Imaging Device SPB and companion SPB system must both be replaced in the event of an SPB failure.
If the companion SPB is a MB with single certificate implementation as defined in Section 9.5.1.1 of [DCI-DCSS] , verify that the system contains exactly one leaf certificate.
If the companion SPB is a MB with dual certificate implementation as defined in Section 9.5.1.2 of [DCI-DCSS] , verify that the system contains exactly two leaf certificates.
Sequence | Type |
---|---|
24.3. SDR Projector Design Review | Pass/Fail |
26.3. HDR Direct View Display Design Review | Pass/Fail |
27.3. SDR Direct View Display Design Review | Pass/Fail |
28.3. HDR Projector Design Review | Pass/Fail |
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
24.3. SDR Projector Design Review | Pass/Fail |
26.3. HDR Direct View Display Design Review | Pass/Fail |
27.3. SDR Direct View Display Design Review | Pass/Fail |
28.3. HDR Projector Design Review | Pass/Fail |
Verify that the Type 2 SPB surrounds the following sub-systems:
Verify through physical inspection that a sample device contains the above listed sub-systems in a manner consistent with the documentation.
Sequence | Type |
---|---|
24.3. SDR Projector Design Review | Pass/Fail |
26.3. HDR Direct View Display Design Review | Pass/Fail |
27.3. SDR Direct View Display Design Review | Pass/Fail |
28.3. HDR Projector Design Review | Pass/Fail |
Verify that the clock is tamper-proof and thereafter may not be reset.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that all TLS end points are within the physical protection perimeter of the associated SPB.

The section "TLS Endpoints" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Implementation of RRPs" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "SMS and SPB Authentication and ITM Transport Layer" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Idempotency of ITM RRPs" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Verify that RRP protocols are synchronous: each pairing must be opened and closed before a new RRP is opened between any two devices. Nested transactions (in which one end point must communicate with another end point while the first waits) are allowed.

The section "RRP Synchronism" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Verify that, except where noted in [DCI-DCSS], non-TLS security communications are not used, and that production Digital Cinema security equipment has no provisions for performing security functions in a TLS "bypass" mode.

The section "TLS Mode Bypass Prohibition" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "RRP Broadcast Prohibition" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Verify that any proprietary ITMs implemented by equipment suppliers do not communicate over TCP or UDP port 1173, and that such ITMs do not communicate information that is the subject of any [SMPTE-430-6] commands.

The section "Implementation of Proprietary ITMs" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "RRP Initiator" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "SPB TLS Session Partners" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "SM TLS Session Partners" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Verify that, unless otherwise noted, an RRP response is allowed to be busy or an unsupported message type, and that such a response is not an error event.

The section "RRP 'Busy' and Unsupported Types" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "RRP Operational Messages" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "FM Generic Inserter Requirements" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
For a Forensic Marking (FM) embedder:
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that IFM is visually transparent to the critical viewer in butterfly tests for motion image content.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that IFM resists/survives video processing attacks (such as digital-to-analog conversions, including multiple D-A/A-D conversions), re-sampling and re-quantization (including dithering and recompression), common signal enhancements to image contrast and color, resizing, letterboxing, aperture control, low-pass filtering, anti-aliasing, brick wall filtering, digital video noise reduction filtering, frame-swapping, compression, arbitrary scaling (aspect ratio is not necessarily constant), cropping, overwriting, addition of noise and other transformations. Verify that IFM survives collusion (the combining of multiple videos in the attempt to make a different fingerprint or to remove it), format conversion, the changing of frequencies and spatial resolution (among, for example, NTSC, PAL and SECAM, into another and vice versa), horizontal and vertical shifting and camcorder capture and low bit rate compression ( e.g. , 500 Kbps H264, 1.1 Mbps MPEG-1).
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that AFM is inaudible in critical listening A/B tests.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that AFM resists/survives multiple D/A and A/D conversions, radio frequency or infrared transmissions within the theater, any combination and down conversion of captured channels, re-sampling of channels, time compression/expansion with pitch shift and pitch preserved, linear speed changes within 10% and pitch-invariant time scaling within 4%. Verify that AFM resists/survives data reduction coding, nonlinear amplitude compression, additive or multiplicative noise, frequency response distortion such as equalization, addition of echo, band-pass filtering, flutter and wow and overdubbing.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that the SM is solely responsible for control of FM marking processes ( i.e. , "on/off") for the auditorium it is installed in and command and control of this function is only via the KDM indicator per [SMPTE-430-1] .
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
The section "SE Time Stamping" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "SE Log Authoring" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Verify that log records stored in SPBs are stored in non-volatile memory and are not purge-able. Verify that data is overwritten beginning with the oldest data as new log data is accumulated. Verify that no log records are overwritten unless collected by the SM.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
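The storage policy above (overwrite oldest first, and only once collected by the SM) can be modeled as a small sketch. The class and method names below are hypothetical and illustrate the policy only, not any particular implementation:

```python
from collections import deque

class SpbLogStore:
    """Toy model of non-volatile SPB log storage: oldest records are
    overwritten first, and only after the SM has collected them."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.records = deque()  # (record, collected) pairs, oldest first

    def append(self, record):
        if len(self.records) == self.capacity:
            # Only a record already collected by the SM may be overwritten.
            if not self.records[0][1]:
                raise RuntimeError("storage full: oldest record not yet collected by SM")
            self.records.popleft()
        self.records.append((record, False))

    def collect(self):
        """SM collects all records; they then become eligible for overwrite."""
        collected = [r for r, _ in self.records]
        self.records = deque((r, True) for r, _ in self.records)
        return collected

store = SpbLogStore(3)
for rec in ["a", "b", "c"]:
    store.append(rec)
store.collect()          # without this call, the next append would raise
store.append("d")        # overwrites "a", which the SM already holds
assert [r for r, _ in store.records] == ["b", "c", "d"]
```

The model refuses to drop uncollected records rather than silently losing them, mirroring the requirement that no log record is overwritten unless collected.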
The section "Remote SPB Log Storage Requirements" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Verify that the SM is capable of storing at least 12 months of typical log data accumulation for the auditorium in which it is installed.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
The section "Logging for Standalone Systems" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Verify that failure or refusal of logged events is also a logged event (as applicable).

The section "Logging of Failed Procedures" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Verify that behavior of security devices (SPB or SE) is specified and designed to immediately terminate operation, and requires replacement, upon any failure of its secure logging operation.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that resident log records in failed SPBs (and their contained SEs) are not purge-able except by authorized repair centers, which are capable of securely recovering such log records.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that once decrypted from the KDM (and except when being used during playback) content keys are either cached within the secure silicon IC, or protected by AES key wrapping per [NIST-800-38F] when cached externally to secure silicon within the Media Block.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
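As an illustration of the external caching case, the `cryptography` Python package exposes the [NIST-800-38F] / RFC 3394 AES key wrap. This is a sketch only; the key values are randomly generated placeholders, not device keys:

```python
import os

from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

kek = os.urandom(16)          # key-encrypting key held in secure silicon (placeholder)
content_key = os.urandom(16)  # AES content key decrypted from a KDM (placeholder)

# Wrap before caching outside secure silicon ([NIST-800-38F] KW / RFC 3394)
wrapped = aes_key_wrap(kek, content_key)
assert len(wrapped) == len(content_key) + 8  # KW appends a 64-bit integrity block

# Integrity-checked unwrap when the key is needed for playback
assert aes_key_unwrap(kek, wrapped) == content_key
```

Unwrapping with a tampered ciphertext or wrong KEK raises an exception, so a corrupted cached key is detected rather than used.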
Verify that SPBs of Type 1 are not field serviceable ( e.g. , SPB Type 1 maintenance access doors shall not be open-able in the field).
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that software protection methods are not used to protect CSPs or content essence.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that, in the event that Exhibition command and control designs include the TMS as a device that interfaces with the SMs, such a TMS is viewed by the security system as an SMS: it carries a digital certificate and follows all other SMS behavior, TLS, and ITM communications requirements.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that the following Digital Cinema Security Parameters (DCSPs) receive SPB Type 1 protection, whenever they exist outside of their originally encrypted state:
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that the mechanism used to generate RSA key pairs has at least 128 bits of entropy (unpredictability).
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
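For illustration only, a sketch of RSA key pair generation with the `cryptography` Python package, which draws randomness from the operating system CSPRNG. A compliant device must additionally demonstrate that its generator is seeded with at least 128 bits of entropy; this sketch does not establish that:

```python
from cryptography.hazmat.primitives.asymmetric import rsa

# generate_private_key draws from the OS CSPRNG; the device design must
# ensure the underlying generator is seeded with >= 128 bits of entropy.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

assert private_key.key_size == 2048
assert private_key.public_key().public_numbers().e == 65537
```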
Verify that AES or TDES symmetric keys pre-loaded into a device are generated with a high quality random number generator with at least 128 bits of entropy (112 bits for TDES).
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
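As a sketch of the generation step, Python's `secrets` module draws from the OS CSPRNG; the key lengths below follow the requirement. This illustrates the entropy source only, not a vetted DRBG:

```python
import secrets

# 128 bits of CSPRNG output for an AES-128 key
aes_key = secrets.token_bytes(16)

# 192-bit TDES key bundle (112 bits of effective strength)
tdes_key = secrets.token_bytes(24)

assert len(aes_key) == 16 and len(tdes_key) == 24
```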
Verify that the Media Decryptor Block is capable of securely caching at least 512 keys.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
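The minimum cache capacity can be exercised with a simple harness; `MediaDecryptorKeyCache` is a hypothetical stand-in for the device's key store, not a real API:

```python
import os

class MediaDecryptorKeyCache:
    """Toy model of a key cache that must hold at least 512 content keys."""
    MIN_CAPACITY = 512

    def __init__(self):
        self._keys = {}  # key-id -> AES key bytes

    def load(self, key_id, key):
        self._keys[key_id] = key

    def get(self, key_id):
        return self._keys[key_id]

# Exercise the minimum capacity: load 512 keys and verify all remain retrievable.
cache = MediaDecryptorKeyCache()
keys = {f"key-{i}": os.urandom(16) for i in range(MediaDecryptorKeyCache.MIN_CAPACITY)}
for key_id, key in keys.items():
    cache.load(key_id, key)
assert all(cache.get(k) == v for k, v in keys.items())
```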
Verify the following:
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that log records are not purged from a Type 1 SPB in the event of intrusion or other tamper detection.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
The section "ASM Get Time Frequency" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "SPB2 Log Memory Availability" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Verify that the SPB's Secure Silicon device meets FIPS 140 level 3 "Physical Security" area requirements as defined for "single-chip cryptographic modules". Failure of this verification is cause to fail this test.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Only applies to a Test Subject that is a Companion SPB (SM).
Verify that the Test Subject retrieves the Imaging Device SPB certificate over the marriage connection.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that, when integrated within an Imaging Device as a companion SPB, or permanently married to the Imaging Device, the MB provides 24/7 log recording support, and storage of all log records associated with the Imaging Device SPB.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
The following applies only to Test Subjects that are Companion SPBs, i.e., MBs designed to operate with an integrated Imaging Device.
Verify that the Test Subject does not operate unless integrated with an Imaging Device SPB. In particular,
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
The section "Standalone MB Single Purpose Requirement" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Verify that the Imaging Device SPB sends log event data across the marriage electrical interface for retention by the companion SPB, as specified in Table 19 of [DCI-DCSS] .
Sequence | Type |
---|---|
24.3. SDR Projector Design Review | Pass/Fail |
26.3. HDR Direct View Display Design Review | Pass/Fail |
27.3. SDR Direct View Display Design Review | Pass/Fail |
28.3. HDR Projector Design Review | Pass/Fail |
Verify that the Test Subject, for the purpose of ASM communications, only supports the TLS CipherSuite "TLS_RSA_WITH_AES_128_CBC_SHA" as specified in [SMPTE-430-6].

The section "TLS RSA Requirement" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
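For reference, OpenSSL names this suite `AES128-SHA`. A sketch of restricting a TLS context to it follows; availability of the suite depends on the local OpenSSL build and security policy:

```python
import ssl

OPENSSL_NAME = "AES128-SHA"  # OpenSSL's name for TLS_RSA_WITH_AES_128_CBC_SHA

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
ctx.maximum_version = ssl.TLSVersion.TLSv1_2
ctx.set_ciphers(OPENSSL_NAME)  # restrict the context to the single ASM suite

assert OPENSSL_NAME in [c["name"] for c in ctx.get_ciphers()]
```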
Only applies if the SM uses dual certificates and the SMS is not permanently integrated.
Verify that if the Test Subject's SMS establishes the TLS session with the SM (SM is the TLS server), the SM Certificate (SM Cert) shall be presented by the SM. Verify that if the Test Subject's SM establishes the TLS session with the SMS (SMS is the TLS server), the Log Signer Certificate (LS Cert) shall be presented by the SM.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that an OMB does not:
Sequence | Type |
---|---|
20.3. OMB Design Review | Pass/Fail |
Verify that under no circumstances does the SM export any KDM-borne key from the SM's SPB.
Sequence | Type |
---|---|
15.3. Integrated IMB Design Review | Pass/Fail |
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
If the MB under test can decrypt Auxiliary Data as defined by [SMPTE-429-14]: Verify that each such decryption takes place only within the MB, and uses only an MDX1 KeyType that is delivered within a KDM. Verify that the MB does not process the MDX2 KeyType.

Supporting Materials
Reference Documents: DCI-DCSS, 9.4.2.7, 9.4.3.6.4; SMPTE-429-14

The section "Encrypted Auxiliary Data Processing" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "OBAE Addendum" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Sequence | Type |
---|---|
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Verify that forensic marking applied to OBAE essence is inaudible in critical listening A/B tests.
Sequence | Type |
---|---|
20.3. OMB Design Review | Pass/Fail |
21.3. Integrated IMBO Design Review | Pass/Fail |
Each of the subsections below describes a DCI requirement that shall be verified by examining the manufacturer's documentation.
Verify, according to the following steps, that the display pixel density of the imaging device is equal to or greater than 60 display pixels per degree when viewed at 1.6 screen heights:
The nominal average human vision performance called 20/20 vision corresponds to the ability to resolve 30 cycles per degree. On an imaging device the minimum pixel density that can show 30 cycles per degree is 60 pixels per degree.
Sequence | Type |
---|---|
26.3. HDR Direct View Display Design Review | Pass/Fail |
27.3. SDR Direct View Display Design Review | Pass/Fail |
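The criterion can be checked numerically: at 1.6 screen heights the vertical field of view is 2·atan(1/(2·1.6)) ≈ 34.7 degrees, so the pixels-per-degree figure follows from the vertical pixel count. A sketch using the vertical image dimension:

```python
import math

def display_pixels_per_degree(vertical_pixels, screen_heights):
    """Pixels per degree for a viewer seated `screen_heights` screen
    heights away, based on the display's vertical field of view."""
    fov_deg = 2 * math.degrees(math.atan(1 / (2 * screen_heights)))
    return vertical_pixels / fov_deg

# A 2160-line image viewed at 1.6 screen heights meets the 60 px/degree criterion.
assert display_pixels_per_degree(2160, 1.6) >= 60   # ~62 px/degree

# A 1080-line image at the same distance falls short.
assert display_pixels_per_degree(1080, 1.6) < 60    # ~31 px/degree
```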
The chapters in this part contain Test Sequences and a standardized Test Report for testing Digital Cinema equipment. Each Test Sequence subjects a Test Subject to a set of Tests selected from Part I. Procedural Tests and Part II. Design Evaluation Guidelines .
The Test Subject is specified at the opening of each Test Sequence chapter and at Table 11.2 . Each Test Subject comprises a single certificated device .
A Test Report consists of information about a Test Session recorded as specified in Table 11.1 . All fields shall be filled in.
Each certificated device listed on line 9 of a Test Report, with the exception of an SMS, shall be a DCI compliant certificated device as defined below.
NOTE 1: Certificated devices listed on line 9 of a Test Report might have achieved DCI compliance against different versions of the CTP.
EXAMPLE 1: An IMB that achieved DCI-compliance per CTP 1.3 can be used when performing Chapter 24. SDR Projector Consolidated Test Sequence in this version of the CTP.
For a permanently married Imaging Device, line 9 of a Test Report shall identify the Imaging Device and companion device (per [DCI-DCSS] , Section 9.4.3.6.6).
For an integrated SMS, line 9 of a Test Report shall identify the SMS and IMB (per [DCI-DCSS] , Section 9.4.2.5, first bullet).
To assure clarity, line 9 of a Test Report shall include the text "permanently married" or "permanently integrated" as applicable.
1. Reporting date | |
---|---|
2. Test Start Date | |
3. CTP version | |
4. Name of Testing Organization | |
5. Address of Testing Organization | |
6. Name of Test Operator | |
7. Test location (if not at Testing Organization's site) | |
8. Name of Test Subject Representative | |
9. Address of Test Subject Representative | |
10. Make and model of Test Subject (to be included on the DCI listing hot link) | |
11. Serial and model numbers identifying each of the participating certificated devices, including software and/or firmware version numbers as applicable. | |
12. Test sequence performed (select one) |
☐ Chapter 15. Integrated IMB Consolidated Test Sequence ☐ Chapter 20. OMB Consolidated Test Sequence ☐ Chapter 21. Integrated IMBO Consolidated Test Sequence ☐ Chapter 24. SDR Projector Consolidated Test Sequence ☐ Chapter 26. HDR Direct View Display Consolidated Test Sequence ☐ Chapter 27. SDR Direct View Display Consolidated Test Sequence ☐ Chapter 28. HDR Projector Consolidated Test Sequence |
13. Test status (select one) |
☐ Pass ☐ Fail |
14. Sequences performed (select one) |
☐ Test Sequence and Design Review ☐ Confidence Sequence (Confidence Retest) |
A Test Session is the collection of Test Results gathered by subjecting a set of certificated devices to one of the Test Sequences listed in Table 11.2 , with the set of certificated devices including at least those required by the Test Sequence. A certificated device is defined by the combination of its manufacturer, product name and model/version number.
NOTE 2: The set of certificated devices used for a Test Session cannot change, e.g., the same IMB is used for all tests specified in a Test Session against Chapter 24. SDR Projector Consolidated Test Sequence .
NOTE 3: Two IMBs with different model/version numbers are not the same IMB when creating a Test Session.
Test sequence | Test Subject | Certificated device that comprises the Test Subject | Certificated devices required to perform the Test Sequence |
---|---|---|---|
Chapter 15. Integrated IMB Consolidated Test Sequence | Integrated IMB | IMB | Imaging Device, IMB, SMS |
Chapter 20. OMB Consolidated Test Sequence | Outboard Media Block | OMB | Imaging Device, OMB, IMB, SMS |
Chapter 21. Integrated IMBO Consolidated Test Sequence | Integrated IMBO | IMBO | Imaging Device, IMBO, SMS |
Chapter 24. SDR Projector Consolidated Test Sequence | Projector | Projector | Projector, IMB or IMBO, SMS |
Chapter 26. HDR Direct View Display Consolidated Test Sequence | Direct View Display | Direct View Display | Direct View Display, IMB or IMBO, SMS |
Chapter 27. SDR Direct View Display Consolidated Test Sequence | Direct View Display | Direct View Display | Direct View Display, IMB or IMBO, SMS |
Chapter 28. HDR Projector Consolidated Test Sequence | Projector | Projector | Projector, IMB or IMBO, SMS |
A Test Result is the outcome of evaluating a set of certificated devices against the steps and requirements of a single Test specified in this version of the CTP. Each Test specifies the type of test result generated, either PASS, FAIL or measured data.
A Test Result may, at the discretion of the Testing Organization, be reused across multiple Test Sessions, and thus different Test Sequences, if and only if (a) the underlying test is identical and (b) the set of certificated devices subjected to the Test are identical. The Testing Organization is required to attest that both conditions (a) and (b) are met.
NOTE 4: The above constraints require the supplier(s) of each certificated device to present to the Testing Organization the Test Results to be reused.
EXAMPLE 2: The Test Result obtained by subjecting an SMS to 8.1.1. Storage System Ingest Interface in the course of performing a Test Session against Chapter 15. Integrated IMB Consolidated Test Sequence can be reused when subjecting the same SMS (as defined by its manufacturer, product name and model/version number) to the same test procedure in the course of performing a Test Session against Chapter 21. Integrated IMBO Consolidated Test Sequence .
EXAMPLE 3: The Test Result obtained by subjecting an IMB to 5.1.1. SPB Digital Certificate as specified in CTP 1.2.1 cannot be reused when subjecting the same IMB against the same test procedure in this version of CTP since 5.1.1. SPB Digital Certificate is not identical in both versions of the CTP.
The status of the Test Session is PASS if none of the test results is FAIL; the status of the Test Session is FAIL otherwise.
The certificated device that comprises a Test Subject of a Test Session whose status is PASS is a “DCI compliant certificated device”.
NOTE 5: A “DCI compliant certificated device” is defined in the context of a specific Test Session, i.e., for a particular set of certificated devices and set of tests defined in a specific version of the CTP. As such, while the certificated device remains DCI-compliant in perpetuity within that context, not all Test Results associated with the DCI compliant certificated device can necessarily be reused in all future Test Sessions in which the certificated device is involved, as described in this section.
An entity may aggregate some of its products that only have different NPRCs into a family group by attesting to the similarity of that family group in a letter, signed by a person who has authority to bind the entity under test to the terms of said letter. The letter shall be sent to the contracted licensed test entity and to DCI at dci.info@dcimovies.com.
The letter, which will be attached to the detailed report of the tested device, will clearly and comprehensively provide detailed information and justification as described below:
Further, the form specified in Table 11.3 shall be attached to the letter to identify the general family group information. This form will be published with the summary report of the tested device on the DCI Compliant Family Groups web site page.
Manufacturer | ||
---|---|---|
Testament Submission Date | ||
Equipment Tested | ||
Family Item 0 | Make | |
Model | ||
Version | ||
Family Item 1 | Make | |
Model | ||
Version | ||
Family Item 2 | Make | |
Model | ||
Version | ||
(add items as needed) | ||
Printed Name | ||
Job Title | ||
Signature |
If any NPRCs fail within a Test Subject during CTP testing, the components may be replaced with equivalent units (having identical technical specifications) and testing resumed from a test point before any anomalous behavior was first observed. In order to assure compliance, the DCI licensed entity will determine the appropriate resumption point and continue testing all procedures thereafter in numerical order.
If any programmable components (e.g., FPGAs) fail within a Test Subject during CTP testing and it can be shown rigorously by the manufacturer that a completely identical component with identical programming is available (same component, version and programming), the programmable component may be replaced with the identical component and testing resumed from a test point before any anomalous behavior was first observed. In order to assure compliance, the DCI licensed entity will determine the appropriate resumption point and continue testing all procedures thereafter in numerical order.
It is recognized that when an SPB-1 or SPB-2 is replaced, the private key will necessarily change. However, the certificate formulation (e.g., digest method, subject name component values (O, OU), subject name CN roles, RSA, key length, extensions, etc.) will need to be identical to that of the certificate in the original SPB so as to result in the same system behavior.
Any device that fails during CTP testing that is connected to the Test Subject may be repaired and then verified as to its correct behavior. If correct behavior is verified, testing may resume from a test point before any anomalous behavior was first observed.
Once compliance has been established for a particular device model per this specification, all product changes shall maintain compliance with the [DCI-DCSS] and this specification. Product changes may require retesting to ensure continued compliance. For the avoidance of doubt, no retesting is required if there are no product changes.
Changes of any kind may require retesting of the SPB-1 device for FIPS 140 compliance by a Cryptographic Module Validation Program (CMVP) testing laboratory. The extent of retesting of FIPS compliance shall be at the determination of the CMVP testing laboratory. Device manufacturers shall notify DCI at dci.info@dcimovies.com of all SPB-1 upgrades prior to deployment, identifying all relevant component version numbers.
Replacing NPRCs with technically equivalent components will not require retesting.
All changes to software and firmware shall maintain compliance with the [DCI-DCSS] and this specification. Device manufacturers shall notify DCI at dci.info@dcimovies.com of all upgrades prior to deployment, identifying all relevant component version numbers.
Changes to software or firmware shall require Confidence Retesting by a licensed DCI test entity on a “three-year or four-upgrade cycle” if either
Any Confidence Retest of updated software or firmware shall be conducted against the version of this specification in force as of the date the device is submitted for Confidence Retest, subject to the following exceptions.
To minimize barriers to upgrades, SPB devices that are already listed on the Compliant Equipment page of the DCI website may be excused from CTP tests associated with the bullets below for purposes of a Confidence Retest:
Confidence Retests have been selected to assure that software/firmware changes have not impacted critical security functionality. Hardware, integrated circuit, FIPS related issues, etc., are addressed by Sections 11.2.4.2. SPB Type 1 (SPB-1) Devices and 11.2.4.3. NPRC .
The chapter "Digital Cinema Package (DCP) Consolidated Test Sequence" was deleted. The chapter number is maintained here to preserve the numbering of subsequent sections.
The chapter "Digital Cinema Server Consolidated Test Sequence" was deleted. The chapter number is maintained here to preserve the numbering of subsequent sections.
The chapter "Standalone D-Cinema Projector Consolidated Test Sequence" was deleted. The chapter number is maintained here to preserve the numbering of subsequent sections.
The test sequence defined in this chapter is intended to be used to test an integrated Image Media Block (IMB) as the Test Subject. The configuration and architecture of the system may vary, but the test sequence requires that the system consists of at least a light processing system including electronic and optical components (Projector), an Image Media Block (containing a Security Manager, Media Decryptor, etc.), and a Screen Management Server (SMS). For the purpose of this test, the Test Operator may substitute a Theater Management Server (TMS) for the SMS if it implements the required functionality. Wherever a test procedure refers to an SMS, the equivalent TMS may also be used.
For the purpose of compliance testing as defined in this Chapter, the spatial resolution of the projector shall be no less than that of the Media Block.
Before performing the test sequence provided below, the Test Operator should read and understand the documentation provided with the Test Subject. If adequate documentation is not available, a Test Subject Representative should be available to provide assistance during the test session.
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
For each requirement listed in the table below, prove that the system design meets the requirement by identifying the software or hardware mechanism that implements the requirement and analyzing the design to assure that the requirement has been met, subject to stipulated conditions. If a proof cannot be made, the design will be considered non-compliant with regard to the requirement. To perform this analysis the examiner will require access to exhibit documents (system design artifacts) such as schematic diagrams, implementation source code, unit test source code, state diagrams, design notes, etc. See Chapter 9: FIPS Requirements for a Type 1 SPB and Chapter 10: DCI Requirements Review for more information.
For each requirement, the examiner must record the identifiers of the exhibits consulted in proving the requirement, including applicable version identifiers, section or sheet numbers, grid identifiers, etc., and the examiner must record Pass or Fail to indicate whether or not the requirement has been met by the design. The examiner may also record any notes relevant to interpreting the exhibits and to the determination of the compliance status.
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
The chapter "Link Decryptor/Encryptor Consolidated Test Sequence" was deleted. The chapter number is maintained here to preserve the numbering of subsequent sections.
The chapter "Digital Cinema Server Consolidated Confidence Sequence" was deleted. The chapter number is maintained here to preserve the numbering of subsequent sections.
The chapter "Standalone D-Cinema Projector Consolidated Test Sequence" was deleted. The chapter number is maintained here to preserve the numbering of subsequent sections.
The chapter "Integrated IMB Consolidated Confidence Sequence" was moved to 15.4. Integrated IMB Confidence Sequence . The chapter number is maintained here to preserve the numbering of subsequent sections.
The test sequence defined in this chapter is intended to be used to test an Outboard Media Block (OMB) as the Test Subject. The configuration and architecture of the system may vary, but the test sequence requires that the system consists of at least an OMB, Projector, IMB and SMS. For the purpose of this test, the Test Operator may substitute a Theater Management Server/System (TMS) for the SMS if it implements the required functionality. Wherever a test procedure refers to an SMS, the equivalent TMS may also be used.
Digital cinema systems that include an OMB operate in Multiple Media Block (MMB) mode, wherein the SMS is responsible for managing playout processes of the OMB and IMB, and the IMB provides synchronization to the OMB. The IMB must also be able to play only a portion of the total content in a composition, as the OMB will be handling some of the content. Thus, the IMB and SMS must be "MMB Capable" to function within a MMB architecture. This Chapter contains specific tests for the IMB and SMS to test for this capability.
Before performing the test sequence provided below, the Test Operator should read and understand the documentation provided with the Test Subject. If adequate documentation is not available, a Test Subject Representative should be available to provide assistance during the test session.
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
For each requirement listed in the table below, prove that the system design meets the requirement by identifying the software or hardware mechanism that implements the requirement and analyzing the design to assure that the requirement has been met, subject to stipulated conditions. If a proof cannot be made, the design will be considered non-compliant with regard to the requirement. To perform this analysis the examiner will require access to exhibit documents (system design artifacts) such as schematic diagrams, implementation source code, unit test source code, state diagrams, design notes, etc. See Chapter 9: FIPS Requirements for a Type 1 SPB and Chapter 10: DCI Requirements Review for more information.
For each requirement, the examiner must record the identifiers of the exhibits consulted in proving the requirement, including applicable version identifiers, section or sheet numbers, grid identifiers, etc., and the examiner must record Pass or Fail to indicate whether or not the requirement has been met by the design. The examiner may also record any notes relevant to interpreting the exhibits and to the determination of the compliance status.
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
The test sequence defined in this chapter is intended to be used to test an integrated Image Media Block with OMB functions (IMBO) as the Test Subject. The configuration and architecture of the system may vary, but the test sequence requires that the system consists of at least a light processing system including electronic and optical components (Projector), an IMBO (containing a Security Manager, Media Decryptors, image, main sound and OBAE sound processing, etc.), and a Screen Management Server/System (SMS). For the purpose of this test, the Test Operator may substitute a Theater Management Server/System (TMS) for the SMS if it implements the required functionality. Wherever a test procedure refers to an SMS, the equivalent TMS may also be used.
For the purpose of compliance testing as defined in this Chapter, the spatial resolution of the projector shall be no less than that of the Media Block.
Before performing the test sequence provided below, the Test Operator should read and understand the documentation provided with the Test Subject. If adequate documentation is not available, a Test Subject Representative should be available to provide assistance during the test session.
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
For each requirement listed in the table below, prove that the system design meets the requirement by identifying the software or hardware mechanism that implements the requirement and analyzing the design to assure that the requirement has been met, subject to stipulated conditions. If a proof cannot be made, the design will be considered non-compliant with regard to the requirement. To perform this analysis the examiner will require access to exhibit documents (system design artifacts) such as schematic diagrams, implementation source code, unit test source code, state diagrams, design notes, etc. See Chapter 9: FIPS Requirements for a Type 1 SPB and Chapter 10: DCI Requirements Review for more information.
For each requirement, the examiner must record the identifiers of the exhibits consulted in proving the requirement, including applicable version identifiers, section or sheet numbers, grid identifiers, etc., and the examiner must record Pass or Fail to indicate whether or not the requirement has been met by the design. The examiner may also record any notes relevant to interpreting the exhibits and to the determination of the compliance status.
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
The chapter "OMB Consolidated Confidence Sequence" was moved to 20.4. OMB Confidence Sequence . The chapter number is maintained here to preserve the numbering of subsequent sections.
The chapter "Integrated IMBO Consolidated Confidence Sequence" was moved to 21.4. Integrated IMBO Confidence Sequence . The chapter number is maintained here to preserve the numbering of subsequent sections.
The Test Subject of the Test Sequence defined in this chapter is a Projector that supports SDR presentations. The configuration and architecture of the system may vary, but the test sequence requires that the system consists of at least a light processing system including electronic and optical components (Projector), an IMB or IMBO (containing a Security Manager, Media Decryptor, etc.), and an SMS. For the purpose of this test, the Test Operator may substitute a Theater Management Server/System (TMS) for the SMS if it implements the required functionality. Wherever a test procedure refers to an SMS, the equivalent TMS may also be used.
For the purpose of compliance testing as defined in this Chapter, the spatial resolution of the projector shall be no less than that of the Media Block.
Before performing the test sequence provided below, the Test Operator should read and understand the documentation provided with the Test Subject. If adequate documentation is not available, a Test Subject Representative should be available to provide assistance during the test session.
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
For each requirement listed in the table below, prove that the system design meets the requirement by identifying the software or hardware mechanism that implements the requirement and analyzing the design to assure that the requirement has been met, subject to stipulated conditions. If a proof cannot be made, the design will be considered non-compliant with regard to the requirement. To perform this analysis the examiner will require access to exhibit documents (system design artifacts) such as schematic diagrams, implementation source code, unit test source code, state diagrams, design notes, etc. See Chapter 9: FIPS Requirements for a Type 1 SPB and Chapter 10: DCI Requirements Review for more information.
For each requirement, the examiner must record the identifiers of the exhibits consulted in proving the requirement, including applicable version identifiers, section or sheet numbers, grid identifiers, etc., and the examiner must record Pass or Fail to indicate whether or not the requirement has been met by the design. The examiner may also record any notes relevant to interpreting the exhibits and to the determination of the compliance status.
Step | Procedure | Pass | Fail | Exhibit Identifiers |
---|---|---|---|---|
1 | 10.4.1. Theater System Reliability | |||
2 | 10.4.19. Access to Imaging Device Image Signals | |||
3 | 10.4.20. Systems with Electronic Marriage | |||
4 | 10.4.21. Systems Without Electronic Marriage | |||
5 | 10.4.25. Repair and Renewal of SPBs | |||
6 | 10.4.26. SPB2 Protected Devices | |||
7 | 10.4.78. Imaging Device SPB Log Reporting Requirements | |||
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
Step | Procedure | Pass | Fail | Measured data |
---|---|---|---|---|
1 | 5.1.1. SPB Digital Certificate | |||
2 | 6.1.20. Validity of SPB Certificates | |||
3 | 6.4.3. FM Payload | |||
4 | 7.2.2. Projector and Direct View Display Security Servicing | |||
5 | 7.5.3. Imaging Device Pixel Count/Structure |
The chapter "Projector Consolidated Confidence Sequence" was moved to 24.4. SDR Projector Confidence Sequence . The chapter number is maintained here to preserve the numbering of subsequent sections.
The Test Subject of the Test Sequence defined in this chapter is a Direct View Display that supports both SDR and HDR presentations. The configuration and architecture of the system may vary, but the test sequence requires that the system consists of at least a light processing system including electronic and optical components (Direct View Display), an IMB or IMBO (containing a Security Manager, Media Decryptor, etc.), and a Screen Management Server (SMS). For the purpose of this test, the Test Operator may substitute a Theater Management Server (TMS) for the SMS if it implements the required functionality. Wherever a test procedure refers to an SMS, the equivalent TMS may also be used.
For the purpose of compliance testing as defined in this Chapter, the spatial resolution of the direct view display shall be no less than that of the Media Block.
Before performing the test sequence provided below, the Test Operator should read and understand the documentation provided with the Test Subject. If adequate documentation is not available, a Test Subject Representative should be available to provide assistance during the test session.
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
For each requirement listed in the table below, prove that the system design meets the requirement by identifying the software or hardware mechanism that implements the requirement and analyzing the design to assure that the requirement has been met, subject to stipulated conditions. If a proof cannot be made, the design will be considered non-compliant with regard to the requirement. To perform this analysis the examiner will require access to exhibit documents (system design artifacts) such as schematic diagrams, implementation source code, unit test source code, state diagrams, design notes, etc. See Chapter 9: FIPS Requirements for a Type 1 SPB and Chapter 10: DCI Requirements Review for more information.
For each requirement, the examiner must record the identifiers of the exhibits consulted in proving the requirement, including applicable version identifiers, section or sheet numbers, grid identifiers, etc., and the examiner must record Pass or Fail to indicate whether or not the requirement has been met by the design. The examiner may also record any notes relevant to interpreting the exhibits and to the determination of the compliance status.
Step | Procedure | Pass | Fail | Exhibit Identifiers |
---|---|---|---|---|
1 | 10.4.1. Theater System Reliability | |||
2 | 10.4.19. Access to Imaging Device Image Signals | |||
3 | 10.4.20. Systems with Electronic Marriage | |||
4 | 10.4.21. Systems Without Electronic Marriage | |||
5 | 10.4.25. Repair and Renewal of SPBs | |||
6 | 10.4.26. SPB2 Protected Devices | |||
7 | 10.4.78. Imaging Device SPB Log Reporting Requirements | |||
8 | 10.5.1. Pixel Visibility (Direct View Display) | |||
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
Step | Procedure | Pass | Fail | Measured data |
---|---|---|---|---|
1 | 5.1.1. SPB Digital Certificate | |||
2 | 6.1.20. Validity of SPB Certificates | |||
3 | 6.4.3. FM Payload | |||
4 | 7.2.2. Projector and Direct View Display Security Servicing | |||
5 | 7.5.3. Imaging Device Pixel Count/Structure |
The Test Subject of the Test Sequence defined in this chapter is a Direct View Display that supports SDR presentations. The configuration and architecture of the system may vary, but the test sequence requires that the system consists of at least a light processing system including electronic and optical components (Direct View Display), an IMB or IMBO (containing a Security Manager, Media Decryptor, etc.), and a Screen Management Server (SMS). For the purpose of this test, the Test Operator may substitute a Theater Management Server (TMS) for the SMS if it implements the required functionality. Wherever a test procedure refers to an SMS, the equivalent TMS may also be used.
For the purpose of compliance testing as defined in this Chapter, the spatial resolution of the direct view display shall be no less than that of the Media Block.
Before performing the test sequence provided below, the Test Operator should read and understand the documentation provided with the Test Subject. If adequate documentation is not available, a Test Subject Representative should be available to provide assistance during the test session.
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
For each requirement listed in the table below, prove that the system design meets the requirement by identifying the software or hardware mechanism that implements the requirement and analyzing the design to assure that the requirement has been met, subject to stipulated conditions. If a proof cannot be made, the design will be considered non-compliant with regard to the requirement. To perform this analysis the examiner will require access to exhibit documents (system design artifacts) such as schematic diagrams, implementation source code, unit test source code, state diagrams, design notes, etc. See Chapter 9: FIPS Requirements for a Type 1 SPB and Chapter 10: DCI Requirements Review for more information.
For each requirement, the examiner must record the identifiers of the exhibits consulted in proving the requirement, including applicable version identifiers, section or sheet numbers, grid identifiers, etc., and the examiner must record Pass or Fail to indicate whether or not the requirement has been met by the design. The examiner may also record any notes relevant to interpreting the exhibits and to the determination of the compliance status.
Step | Procedure | Pass | Fail | Exhibit Identifiers |
---|---|---|---|---|
1 | 10.4.1. Theater System Reliability | |||
2 | 10.4.19. Access to Imaging Device Image Signals | |||
3 | 10.4.20. Systems with Electronic Marriage | |||
4 | 10.4.21. Systems Without Electronic Marriage | |||
5 | 10.4.25. Repair and Renewal of SPBs | |||
6 | 10.4.26. SPB2 Protected Devices | |||
7 | 10.4.78. Imaging Device SPB Log Reporting Requirements | |||
8 | 10.5.1. Pixel Visibility (Direct View Display) | |||
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
Step | Procedure | Pass | Fail | Measured data |
---|---|---|---|---|
1 | 5.1.1. SPB Digital Certificate | |||
2 | 6.1.20. Validity of SPB Certificates | |||
3 | 6.4.3. FM Payload | |||
4 | 7.2.2. Projector and Direct View Display Security Servicing | |||
5 | 7.5.3. Imaging Device Pixel Count/Structure |
The Test Subject of the Test Sequence defined in this chapter is a Projector that supports both SDR and HDR presentations. The configuration and architecture of the system may vary, but the test sequence requires that the system consists of at least a light processing system including electronic and optical components (Projector), an IMB or IMBO (containing a Security Manager, Media Decryptor, etc.), and an SMS. For the purpose of this test, the Test Operator may substitute a Theater Management Server/System (TMS) for the SMS if it implements the required functionality. Wherever a test procedure refers to an SMS, the equivalent TMS may also be used.
For the purpose of compliance testing as defined in this Chapter, the spatial resolution of the projector shall be no less than that of the Media Block.
Before performing the test sequence provided below, the Test Operator should read and understand the documentation provided with the Test Subject. If adequate documentation is not available, a Test Subject Representative should be available to provide assistance during the test session.
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
For each requirement listed in the table below, prove that the system design meets the requirement by identifying the software or hardware mechanism that implements the requirement and analyzing the design to assure that the requirement has been met, subject to stipulated conditions. If a proof cannot be made, the design will be considered non-compliant with regard to the requirement. To perform this analysis the examiner will require access to exhibit documents (system design artifacts) such as schematic diagrams, implementation source code, unit test source code, state diagrams, design notes, etc. See Chapter 9: FIPS Requirements for a Type 1 SPB and Chapter 10: DCI Requirements Review for more information.
For each requirement, the examiner must record the identifiers of the exhibits consulted in proving the requirement, including applicable version identifiers, section or sheet numbers, grid identifiers, etc., and the examiner must record Pass or Fail to indicate whether or not the requirement has been met by the design. The examiner may also record any notes relevant to interpreting the exhibits and to the determination of the compliance status.
Step | Procedure | Pass | Fail | Exhibit Identifiers |
---|---|---|---|---|
1 | 10.4.1. Theater System Reliability | |||
2 | 10.4.19. Access to Imaging Device Image Signals | |||
3 | 10.4.20. Systems with Electronic Marriage | |||
4 | 10.4.21. Systems Without Electronic Marriage | |||
5 | 10.4.25. Repair and Renewal of SPBs | |||
6 | 10.4.26. SPB2 Protected Devices | |||
7 | 10.4.78. Imaging Device SPB Log Reporting Requirements | |||
For each row of the table below, perform the procedure specified in the Procedure column, subject to all conditions specified in the Condition column. Indicate the status of the test in the Pass or Fail column, unless the test is specified as data only. Any marks in greyed-out fields indicate a test failure. Report any information listed in the Measured Data column. The Test Operator may record any additional observations.
Step | Procedure | Pass | Fail | Measured data |
---|---|---|---|---|
1 | 5.1.1. SPB Digital Certificate | |||
2 | 6.1.20. Validity of SPB Certificates | |||
3 | 6.4.3. FM Payload | |||
4 | 7.2.2. Projector and Direct View Display Security Servicing | |||
5 | 7.5.3. Imaging Device Pixel Count/Structure |
To facilitate consistent testing of d-cinema equipment, a set of reference files has been produced to be used as directed in the respective test procedures. These materials are described in detail in this Appendix with the intention that the materials can be re-created from the descriptions and used to achieve testing results equivalent to those achieved with the original reference files.
The test material described below consists of digital certificates, Key Delivery Messages (KDM) and D-Cinema Packages (DCP). A DCP can be further deconstructed as a set of Track Files, Composition Playlists and related file descriptions. Some Track Files will be encrypted.
Because the identity of a Test Subject cannot be known until the device has been manufactured, it is not possible to create reference KDM files in advance. It is therefore necessary to divide the test material into two categories: common-use reference material and per-device reference material. Common-use reference material can be created once and used without limit on any compliant system. Per-device reference material must be created for each Test Subject, with foreknowledge of the date and time of the test session.
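Because per-device KDMs must be generated with foreknowledge of the date and time of the test session, their validity window can be derived from the scheduled session time. The following is a minimal sketch, not prescribed by this document; the session length and safety margin are illustrative assumptions.

```python
from datetime import datetime, timedelta, timezone

def kdm_validity_window(session_start, session_hours=8, margin_hours=1):
    """Hypothetical helper: bracket a scheduled test session with a
    safety margin on each side, returning the KDM's
    (not_valid_before, not_valid_after) pair."""
    not_before = session_start - timedelta(hours=margin_hours)
    not_after = session_start + timedelta(hours=session_hours + margin_hours)
    return not_before, not_after

# Example: a session scheduled for 09:00 UTC.
session = datetime(2024, 6, 3, 9, 0, tzinfo=timezone.utc)
nb, na = kdm_validity_window(session)
```

Windows computed this way keep common-use material reusable while tying per-device KDMs to one known session.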
Two additional categories of reference material exist: compliant and intentionally non-compliant. Most of the material will be "golden" reference files, intended to be entirely compliant with the relevant specifications. Other files, however, will be intentionally broken to allow testing of error detection and recovery mechanisms.
This section defines a set of MXF picture track files. For each track file, a description is given which details the images encoded in the file. The image track files will be combined with sound files to make complete compositions (see Section A.4).
The section "2K FM Control Granularity Begin" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "2K FM Control Granularity End" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "DCI_gradient_step_s_color_j2c_pt" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Timed Text Example with Font" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Timed Text Example with PNG" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Plain_Frame_nosub_j2c_ct" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Sync Count, but with KDM-Borne MIC Key" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Step Number | X′ | Y′ | Z′ |
---|---|---|---|
1 | 269 | 274 | 283 |
2 | 270 | 275 | 284 |
3 | 271 | 276 | 285 |
4 | 272 | 277 | 286 |
5 | 273 | 278 | 287 |
6 | 274 | 279 | 288 |
7 | 275 | 280 | 289 |
8 | 276 | 281 | 290 |
9 | 277 | 282 | 292 |
10 | 278 | 283 | 292 |
11 | 279 | 284 | 293 |
12 | 279 | 285 | 295 |
13 | 280 | 286 | 296 |
14 | 281 | 287 | 297 |
15 | 282 | 288 | 298 |
16 | 283 | 289 | 299 |
17 | 284 | 290 | 300 |
Step Number | X″ | Y″ | Z″ |
---|---|---|---|
1 | 177 | 181 | 188 |
2 | 178 | 182 | 189 |
3 | 179 | 183 | 190 |
4 | 180 | 184 | 191 |
5 | 181 | 185 | 192 |
6 | 182 | 186 | 193 |
7 | 183 | 187 | 194 |
8 | 184 | 188 | 195 |
9 | 185 | 189 | 196 |
10 | 186 | 190 | 197 |
11 | 187 | 191 | 198 |
12 | 188 | 192 | 199 |
13 | 189 | 193 | 200 |
14 | 190 | 194 | 201 |
15 | 191 | 195 | 203 |
16 | 192 | 196 | 204 |
17 | 193 | 197 | 205 |
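A simple sanity check on step tables such as the two above is that every code-value column is non-decreasing from one step to the next. A minimal sketch (the values are copied from the second table; the helper name is illustrative):

```python
# X'', Y'', Z'' code values from the 17-step table above.
STEPS = [
    (177, 181, 188), (178, 182, 189), (179, 183, 190), (180, 184, 191),
    (181, 185, 192), (182, 186, 193), (183, 187, 194), (184, 188, 195),
    (185, 189, 196), (186, 190, 197), (187, 191, 198), (188, 192, 199),
    (189, 193, 200), (190, 194, 201), (191, 195, 203), (192, 196, 204),
    (193, 197, 205),
]

def columns_non_decreasing(rows):
    """True if no column ever decreases between consecutive steps."""
    return all(
        prev[i] <= cur[i]
        for prev, cur in zip(rows, rows[1:])
        for i in range(len(prev))
    )
```

A reconstructed table that fails this check would indicate a transcription error in the step values.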
This section defines a set of MXF sound track files. For each track file, a description is given which details the sounds encoded in the file. The sound track files will be combined with image files to make complete compositions (see Section A.4).
The section "Pink Noise, 16 Channels, 96 kHz (Encrypted)" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "400 hz sine wave (Encrypted)" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "FM StEM 5.1 Sound (Encrypted)" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
c5 9a f6 6f bd e0 70 39 ba 36 2c 62 e8 21 e6 02.

The section "Sync Count 5.1, but with KDM-Borne MIC Key" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "OBAE Tone Multi-Reel" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Main Sound for OBAE Tone Multi-Reel" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Audio Tone Multi-Reel" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
This section defines a set of D-Cinema Compositions and D-Cinema Packages. The Compositions depend upon the track files created in Section A.2 and Section A.3. The Packages contain the Compositions for ingest.
The section "Multi-line Subtitle Test" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Multi-line PNG Subtitle Test" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Subtitle Test Part 1" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Subtitle Test Part 2" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "Subtitle Test Part 3" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "DCI 2K Moving Gradient" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "DCI DCP 2K (Encrypted)" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
Six certificate chains are defined, which separate certificates by device type and level of conformity. In the descriptions below, the IMB label refers to a certificate which contains roles for a Media Block (MB) or a certificate which signs such certificates. Similarly, PRJ refers to certificates or signers associated with an Imaging Device and KDS refers to certificates associated with a Key Distribution System (a KDM authoring entity).
Contents removed, not used by any procedure
Contents removed, not used by any procedure
The KDM files defined in this section must be generated for the Test Subject and the time and date of the test procedure.
ForensicMarkFlag element with the value http://www.dcimovies.com/430-1/2006/KDM#mrkflg-audio-disable-above-channel-06, and one ForensicMarkFlag element with the value http://www.dcimovies.com/430-1/2006/KDM#mrkflg-audio-disable-above-channel-08.
The section "KDM with the projector and LDB on the TDL" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "KDM with the LDB alone on the TDL" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "KDM with imminent expiration date" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
http://www.smpte-qa.org/schemas/430-3/2001/ETM shall be used.
The section "KDM for 2K StEM with Device Specific Special Auditorium TDL" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "KDM for DCI 2K StEM with a TDL that contains all of the certificate thumbprints for the devices in the special auditorium situation" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "KDM with a TDL including Responder A" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "KDM with a TDL including Responder B" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "KDM with a TDL that contains all of the certificate thumbprints for the devices in the special auditorium situation and an additional device certificate" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "KDM with a TDL that contains all but one of the certificate thumbprints for the devices in the special auditorium situation" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "KDM with a TDL that contains all of the certificate thumbprints for the devices in the special auditorium situation and the 'assume trust' thumbprint" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "KDM with a TDL that contains one more LD/LE device thumbprints than there are LD/projector thumbprints in the special auditorium situation" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "KDM with Assume Trust TDL Entry for DCI 2K Sync Test (Encrypted)" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The section "KDM with a TDL that contains all of the certificate thumbprints for the devices in the special auditorium situation" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
KeyType value "MDAK". http://www.smpte-qa.org/schemas/430-3/2001/ETM shall be used.
Wherever possible, the computer programs used in the test procedures in this document are freely available. Where appropriate, the listings in Appendix B provide a URL where the software can be obtained.
In some cases, it was necessary to develop programs because free alternatives were not available. Those programs are presented here in source code form along with instructions for building and executing the programs.
The programs are expressed in the C, C++ and Python programming languages. Build instructions and prerequisites for the C and C++ programs are given in the comments at the beginning of each source module. Machine readable copies of the programs are available in the source-code directory in the Reference Materials distribution (see Appendix A).
This program reads a PEM formatted X509 certificate and calculates a SHA-1 message digest over the signed portion of the certificate as required by [SMPTE-430-2]. The value is encoded as a Base64 string and returned on stdout. The following example illustrates this usage:
$ dc-thumbprint my-cert.pem
aZMVnZ/TzEvLUCmQFcc8U0je9uo=
/*
 * dc-thumbprint.c -- calculate certificate thumbprint of PEM-encoded
 * X.509 document per SMPTE 430-2
 *
 * $Id$
 *
 * This program requires OpenSSL. To build:
 * $ cc -o dc-thumbprint dc-thumbprint.c -lcrypto
 */

#include <errno.h>
#include <stdio.h>
#include <string.h>
#include <openssl/crypto.h>
#include <openssl/err.h>
#include <openssl/evp.h>
#include <openssl/pem.h>
#include <openssl/x509.h>

int
main(int argc, char* argv[])
{
  /* pointer to SHA-1 hash details */
  const EVP_MD *md = EVP_sha1();
  /* PEM source file pointer */
  FILE *fp = NULL;
  /* pointer to an X509 structure */
  X509 *x = NULL;
  /* pointer to DER-encoded TBSCertificate */
  unsigned char *p_tbs = NULL;
  /* length of DER-encoded TBSCertificate (p_tbs) */
  int tbs_len = 0;
  /* buffer for the message digest */
  unsigned char md_value[EVP_MAX_MD_SIZE];
  /* return value from digest calculation */
  int digest_rc = 0;
  /* buffer for base64 encoding of the message digest */
  char md_base64[EVP_MAX_MD_SIZE * 4 / 3 + 2];

  if (argc != 2) {
    fprintf(stderr, "Usage: dc-thumbprint cert-file.pem\n");
    return 1;
  }

  fp = fopen(argv[1], "r");
  if (fp == NULL) {
    fprintf(stderr, "ERROR: Cannot open %s: %s\n", argv[1], strerror(errno));
    return 2;
  }

  x = PEM_read_X509(fp, NULL, NULL, NULL);
  (void) fclose(fp);
  if (x == NULL) {
    ERR_print_errors_fp(stderr);
    return 3;
  }

  /* get the tbsCertificate as a DER string */
  tbs_len = i2d_re_X509_tbs(x, &p_tbs);
  X509_free(x);
  if (tbs_len <= 0) {
    ERR_print_errors_fp(stderr);
    return 4;
  }

  /* perform the message digest */
  digest_rc = EVP_Digest(p_tbs, tbs_len, md_value, NULL, md, NULL);
  OPENSSL_free(p_tbs);
  if (digest_rc == 0) {
    ERR_print_errors_fp(stderr);
    return 5;
  }

  /* perform the base64 encoding */
  (void) EVP_EncodeBlock((unsigned char *)md_base64, md_value,
                         EVP_MD_meth_get_result_size(md));
  printf("%s\n", md_base64);
  return 0;
}

/*
 * end dc-thumbprint.c
 */
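For reference, the digest step performed by dc-thumbprint is small enough to sketch in a few lines of Python. The helper below is illustrative only and is not part of the Reference Materials; the commented usage assumes the third-party cryptography package for extracting the tbsCertificate bytes from a PEM file.

```python
import base64
import hashlib

def thumbprint_of_tbs(tbs_der: bytes) -> str:
    """Base64-encoded SHA-1 digest of a DER-encoded tbsCertificate,
    the thumbprint form required by SMPTE 430-2."""
    return base64.b64encode(hashlib.sha1(tbs_der).digest()).decode("ascii")

# Hypothetical usage with the third-party 'cryptography' package:
#   from cryptography import x509
#   cert = x509.load_pem_x509_certificate(open("my-cert.pem", "rb").read())
#   print(thumbprint_of_tbs(cert.tbs_certificate_bytes))
```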
This program parses and validates XML instance documents. When an XML document is specified alone, the file is checked for well-formedness but is not validated. When an XML document is specified with one or more schema files, schema-check validates that file against the schemas. Only one file to be tested may be specified at a time. Note that schema files must be listed in order of dependency (most dependent last). The following example illustrates using the program to check well-formedness:
$ schema-check perfect-movie.cpl.xml
The next example shows how to use the program to check for valid content:
$ schema-check perfect-movie.cpl.xml SMPTE-428-7.xsd
//
// schema-check.cpp -- test XML document against schema
//
// $Id$
//
// This program requires the Xerces-c XML library. To build:
// $ c++ -o schema-check schema-check.cpp -lxerces-c
//

#include <iostream>
#include <list>
#include <string>
#include <cstdio>
#include <xercesc/util/OutOfMemoryException.hpp>
#include <xercesc/dom/DOM.hpp>
#include <xercesc/parsers/XercesDOMParser.hpp>
#include <xercesc/framework/XMLGrammarDescription.hpp>
#include <xercesc/sax/ErrorHandler.hpp>
#include <xercesc/sax/SAXParseException.hpp>

using std::cerr;
using std::endl;

XERCES_CPP_NAMESPACE_USE

// ---------------------------------------------------------------------------
// Utility code adapted from the DOMPrint program distributed with Xerces-c

// simple transcoding wrapper
class StrX
{
  char* fLocalForm;

public:
  StrX(const XMLCh* const toTranscode) {
    fLocalForm = XMLString::transcode(toTranscode);
  }

  ~StrX() {
    XMLString::release(&fLocalForm);
  }

  const char* localForm() const {
    return fLocalForm;
  }
};

std::ostream& operator<<(std::ostream& target, const StrX& toDump)
{
  target << toDump.localForm();
  return target;
}

// error handler interface
class DOMTreeErrorReporter : public ErrorHandler
{
public:
  void warning(const SAXParseException& toCatch) {}
  void resetErrors() {}

  void error(const SAXParseException& toCatch) {
    cerr << "Error at file \"" << StrX(toCatch.getSystemId())
         << "\", line " << toCatch.getLineNumber()
         << ", column " << toCatch.getColumnNumber() << endl
         << " Message: " << StrX(toCatch.getMessage()) << endl;
  }

  void fatalError(const SAXParseException& toCatch) {
    cerr << "Fatal Error at file \"" << StrX(toCatch.getSystemId())
         << "\", line " << toCatch.getLineNumber()
         << ", column " << toCatch.getColumnNumber() << endl
         << " Message: " << StrX(toCatch.getMessage()) << endl;
  }
};

// ---------------------------------------------------------------------------

int
main(int argc, const char** argv)
{
  try {
    XMLPlatformUtils::Initialize();
  }
  catch (const XMLException& e) {
    StrX tmp_e(e.getMessage());
    cerr << "Xerces initialization error: " << tmp_e.localForm() << endl;
    return 2;
  }

  // check command line for arguments
  if ( argc < 2 ) {
    cerr << "usage: schema-check <xml-file> [<schema-file> ...]" << endl;
    return 3;
  }

  for ( int i = 1; i < argc; i++ ) {
    FILE *f = fopen(argv[i], "r");

    if ( f == 0 ) {
      perror(argv[i]);
      return 4;
    }
  }

  XercesDOMParser *parser = new XercesDOMParser;
  DOMTreeErrorReporter *errReporter = new DOMTreeErrorReporter();
  parser->setErrorHandler(errReporter);
  parser->setDoNamespaces(true);
  parser->setCreateEntityReferenceNodes(true);
  parser->useCachedGrammarInParse(true);

  if ( argc > 2 ) {
    parser->setDoSchema(true);
    parser->setValidationScheme(AbstractDOMParser::Val_Always);
    parser->setValidationSchemaFullChecking(true);

    for ( int i = 2; i < argc; i++ ) {
      if ( parser->loadGrammar(argv[i], Grammar::SchemaGrammarType, true) == 0 ) {
        cerr << "Error loading grammar " << std::string(argv[i]) << endl;
        return 4;
      }
    }
  }

  bool errorsOccured = true;

  try {
    parser->parse(argv[1]);
    errorsOccured = false;
  }
  catch ( const OutOfMemoryException& ) {
    cerr << "Out of memory exception." << endl;
  }
  catch ( const XMLException& e ) {
    cerr << "An error occurred during parsing" << endl
         << " Message: " << StrX(e.getMessage()) << endl;
  }
  catch ( const DOMException& e ) {
    const unsigned int maxChars = 2047;
    XMLCh errText[maxChars + 1];

    cerr << endl << "A DOM error occurred during parsing: '"
         << std::string(argv[1]) << "'" << endl
         << "DOM Exception code: " << e.code << endl;

    if ( DOMImplementation::loadDOMExceptionMsg(e.code, errText, maxChars) )
      cerr << "Message is: " << StrX(errText) << endl;
  }
  catch (...) {
    cerr << "An unclassified error occurred during parsing." << endl;
  }

  return errorsOccured ? 1 : 0;
}

//
// end schema-check.cpp
//
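The no-schema mode of schema-check (well-formedness only) can be approximated with the Python standard library. The sketch below is illustrative and is not part of the Reference Materials; full schema validation still requires a validating parser such as the Xerces-c library used above.

```python
import xml.etree.ElementTree as ET

def is_well_formed(xml_text: str) -> bool:
    """Parse the document and report only whether it is well-formed XML,
    roughly matching schema-check invoked without schema files."""
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False
```

For example, is_well_formed("<Reel><Asset/></Reel>") returns True, while a document with mismatched tags returns False.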
This program reads a KDM and an RSA private key in PEM format and decrypts the EncryptedKey elements in the KDM. The decrypted key blocks are printed to stdout. Note that key blocks in the KDM must have been encrypted using the public key which corresponds to the RSA key given as the second argument to this program.
$ kdm-decrypt test_file.kdm.xml my_id_key.pem
    CipherDataID: f1dc124460169a0e85bc300642f866ab
SignerThumbprint: q5Oqr6GkfG6W2HzcBTee5m0Qjzw=
          CPL Id: 119d8990-2e55-4114-80a2-e53f3403118d
          Key Id: b6276c4b-b832-4984-aab6-250c9e4f9138
        Key Type: MDIK
      Not Before: 2007-09-20T03:24:53-00:00
       Not After: 2007-10-20T03:24:53-00:00
        Key Data: 7f2f711f1b4d44b83e1dd1bf90dc7d8c
//
// kdm-decrypt.cpp -- decrypt and display KDM EncryptedKey elements
//
// $Id$
//
// This program requires the Xerces-c XML, XMLSecurity, OpenSSL
// and asdcplib libraries. To build:
//
// c++ -o kdm-decrypt kdm-decrypt.cpp \
//   -lxerces-c -lxml-security-c -lkumu -lcrypto
//

#include <KM_util.h>
#include <KM_fileio.h>
#include <ctype.h>
#include <iostream>
#include <string>
#include <openssl/pem.h>
#include <xercesc/util/OutOfMemoryException.hpp>
#include <xercesc/parsers/XercesDOMParser.hpp>
#include <xercesc/framework/MemBufInputSource.hpp>
#include <xsec/framework/XSECProvider.hpp>
#include <xsec/framework/XSECException.hpp>
#include <xsec/enc/XSECCryptoException.hpp>
#include <xsec/enc/OpenSSL/OpenSSLCryptoKeyRSA.hpp>

XERCES_CPP_NAMESPACE_USE

using std::cout;
using std::cerr;
using std::endl;
using namespace Kumu;

const size_t KeyType_Length = 4;
const size_t DateTime_Length = 25;
const ui32_t X509Thumbprint_Length = 20;

// A structure to hold key block data retrieved during a decrypt operation.
struct S430_2_KeyBlock
{
  byte_t CipherDataID[UUID_Length];
  byte_t SignerThumbprint[X509Thumbprint_Length];
  byte_t CPLId[UUID_Length];
  byte_t KeyType[KeyType_Length];
  byte_t KeyId[UUID_Length];
  byte_t NotBefore[DateTime_Length];
  byte_t NotAfter[DateTime_Length];
  byte_t KeyData[SymmetricKey_Length];

  S430_2_KeyBlock() {
    memset(this, 0, sizeof(S430_2_KeyBlock));
  }

  std::string Dump() const;
};

std::string
safe_char(char c)
{
  char b[2] = {'*', 0};

  if ( isprint(c) )
    b[0] = c;

  return b;
}

// Pretty-print key block data.
std::string
S430_2_KeyBlock::Dump() const
{
  using std::string;
  Kumu::Identifier<X509Thumbprint_Length> TmpThumbprint;
  UUID TmpUUID;
  char tmp_buf[64];
  string out_string;

  bin2hex(CipherDataID, UUID_Length, tmp_buf, 64);
  out_string = "    CipherDataID: " + string(tmp_buf);

  TmpThumbprint.Set(SignerThumbprint);
  out_string += "\nSignerThumbprint: " + string(TmpThumbprint.EncodeBase64(tmp_buf, 64));

  TmpUUID.Set(CPLId);
  out_string += "\n          CPL Id: " + string(TmpUUID.EncodeHex(tmp_buf, 64));

  TmpUUID.Set(KeyId);
  out_string += "\n          Key Id: " + string(TmpUUID.EncodeHex(tmp_buf, 64));

  out_string += "\n        Key Type: " + safe_char(KeyType[0]) + safe_char(KeyType[1])
    + safe_char(KeyType[2]) + safe_char(KeyType[3]);

  assert(DateTime_Length < 64);
  tmp_buf[DateTime_Length] = 0;
  memcpy(tmp_buf, NotBefore, DateTime_Length);
  out_string += "\n      Not Before: " + string(tmp_buf);

  memcpy(tmp_buf, NotAfter, DateTime_Length);
  out_string += "\n       Not After: " + string(tmp_buf);

  bin2hex(KeyData, UUID_Length, tmp_buf, 64);
  out_string += "\n        Key Data: " + string(tmp_buf);

  out_string += "\n";
  return out_string;
}

// Given a KDM string and a parsed RSA key, decrypt the key blocks
// in the KDM and print them to stdout.
int
decrypt_kdm(const std::string& KDMDocument, EVP_PKEY* Target)
{
  assert(Target);
  XercesDOMParser* parser = new XercesDOMParser;
  parser->setDoNamespaces(true);
  parser->setCreateEntityReferenceNodes(true);

  try {
    MemBufInputSource xmlSource(reinterpret_cast<const XMLByte*>(KDMDocument.c_str()),
                                static_cast<XMLSize_t>(KDMDocument.length()),
                                "pidc_rules_file");
    parser->parse(xmlSource);
    int errorCount = parser->getErrorCount();

    if ( errorCount > 0 ) {
      cerr << "XML parse errors: " << errorCount << endl;
      return -1;
    }
  }
  catch ( const OutOfMemoryException& ) {
    cerr << "Out of memory exception." << endl;
  }
  catch ( const XMLException& e ) {
    char* emsg = XMLString::transcode(e.getMessage());
    cerr << "An error occurred during parsing" << endl
         << " Message: " << emsg << endl;
    XSEC_RELEASE_XMLCH(emsg);
  }
  catch ( const DOMException& e ) {
    const unsigned int maxChars = 2047;
    XMLCh errText[maxChars + 1];

    cerr << endl << "DOM Exception code is: " << e.code << endl;

    if ( DOMImplementation::loadDOMExceptionMsg(e.code, errText, maxChars) ) {
      char* emsg = XMLString::transcode(errText);
      cerr << "Message is: " << emsg << endl;
      XSEC_RELEASE_XMLCH(emsg);
    }
  }
  catch (...) {
    cerr << "Unexpected XML parser error." << endl;
  }

  try {
    XSECProvider prov;
    OpenSSLCryptoKeyRSA* PrivateKey = new OpenSSLCryptoKeyRSA(Target);

    if ( PrivateKey == 0 ) {
      cerr << "Error reading private key" << endl;
      return -1;
    }

    DOMDocument* doc = parser->getDocument();
    assert(doc);

    XENCCipher* cipher = prov.newCipher(doc);
    cipher->setKEK(PrivateKey);

    DOMNodeIterator* Iter =
      ((DOMDocumentTraversal*)doc)->createNodeIterator(doc, (DOMNodeFilter::SHOW_ELEMENT), 0, false);
    assert(Iter);

    DOMNode* Node;
    int keys_accepted = 0;
    int key_nodes_found = 0;

    while ( (Node = Iter->nextNode()) != 0 ) {
      char* n = XMLString::transcode(Node->getLocalName());

      if ( n == 0 )
        continue;

      if ( strcmp(n, "EncryptedKey") == 0 ) {
        key_nodes_found++;
        S430_2_KeyBlock CipherData;
        ui32_t decrypt_len = cipher->decryptKey((DOMElement*)Node,
                                                (byte_t*)&CipherData, sizeof(CipherData));

        if ( decrypt_len == sizeof(CipherData) ) {
          keys_accepted++;
          cout << CipherData.Dump();
        }
        else if ( decrypt_len > 0 )
          cerr << "Unexpected cipher block length: " << decrypt_len << endl;
        else
          cerr << "Error decoding key block: " << key_nodes_found << endl;
      }

      XSEC_RELEASE_XMLCH(n);
    }

    Iter->release();
  }
  catch (XSECException &e) {
    char* emsg = XMLString::transcode(e.getMsg());
    cerr << "Key decryption error: " << emsg << endl;
    XSEC_RELEASE_XMLCH(emsg);
    return -1;
  }
  catch (XSECCryptoException &e) {
    cerr << "Crypto error: " << e.getMsg() << endl;
    return -1;
  }
  catch (...) {
    cerr << "Unexpected decryption error." << endl;
  }

  delete parser;
  return 0;
}

//
int
main(int argc, const char** argv)
{
  if ( argc < 3 ) {
    cerr << "USAGE: kdm-decrypt <kdm-file> <RSA-PEM-file>" << endl;
    return 2;
  }

  try {
    XMLPlatformUtils::Initialize();
    XSECPlatformUtils::Initialise();
  }
  catch (const XMLException& e) {
    char* emsg = XMLString::transcode(e.getMessage());
    cerr << "Xerces or XMLSecurity initialization error: " << emsg << endl;
    XSEC_RELEASE_XMLCH(emsg);
    return 3;
  }
  catch (...) {
    cerr << "Unexpected Xerces or XMLSecurity initialization error." << endl;
  }

  FILE* fp = fopen(argv[2], "r");

  if ( fp == 0 ) {
    perror(argv[2]);
    return 4;
  }

  EVP_PKEY* Target = PEM_read_PrivateKey(fp, 0, 0, 0);
  fclose(fp);

  if ( Target == 0 ) {
    cerr << "Error reading RSA key in file " << std::string(argv[2]) << endl;
    return 5;
  }

  std::string XML_doc;
  Result_t result = ReadFileIntoString(argv[1], XML_doc);

  if ( KM_FAILURE(result) ) {
    cerr << "Error reading XML file " << std::string(argv[1]) << endl;
    return 6;
  }

  if ( decrypt_kdm(XML_doc, Target) != 0 )
    return 1;

  return 0;
}

//
// end kdm-decrypt.cpp
//
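The fixed layout of the decrypted key block can also be expressed compactly. The following Python sketch is illustrative only (it is not part of the Reference Materials); it splits a 138-byte plaintext key block into the same fixed-width fields as the S430_2_KeyBlock structure used by kdm-decrypt.

```python
from collections import namedtuple

# Field widths follow the S430_2_KeyBlock structure: three UUIDs (16 bytes
# each), a certificate thumbprint (20), a 4-character key type, two 25-byte
# timestamps and a 16-byte AES key -- 138 bytes in total.
FIELDS = (("cipher_data_id", 16), ("signer_thumbprint", 20),
          ("cpl_id", 16), ("key_type", 4), ("key_id", 16),
          ("not_before", 25), ("not_after", 25), ("key_data", 16))

KeyBlock = namedtuple("KeyBlock", [name for name, _ in FIELDS])

def parse_key_block(plaintext: bytes) -> KeyBlock:
    """Split a decrypted SMPTE 430-2 key block into fixed-width fields."""
    expected = sum(width for _, width in FIELDS)
    if len(plaintext) != expected:
        raise ValueError("unexpected key block length: %d" % len(plaintext))
    values, offset = [], 0
    for _, width in FIELDS:
        values.append(plaintext[offset:offset + width])
        offset += width
    return KeyBlock(*values)
```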
This program reads a JPEG 2000 codestream from a file and produces parametric data on the standard output. The following example illustrates this usage:
$ j2c-scan test_frame_000002.j2c
coding parameters
  digital cinema profile: none
  rsiz capabilities: standard
  pixel offset from top-left corner: (0, 0)
  tile width/height in pixels: (2048, 1080)
  image width/height in tiles: (1, 1)
  tile #1
    coding style: 1
    progression order: Component-Position-Resolution-Layer
    POC marker flag: 0
    number of quality layers: 1
      rate for layer #1: 0.0
    multi-component transform flag: 1
    component #1
      coding style: 1
      number of resolutions: 6
      code block width/height: (5, 5)
      code block coding style: 0
      discrete wavelet transform identifier: 0
      quantization style: 2
      number of guard bits: 1
      step size pairs: 16
      region of interest shift: 0
    component #2
      coding style: 1
      number of resolutions: 6
      code block width/height: (5, 5)
      code block coding style: 0
      discrete wavelet transform identifier: 0
      quantization style: 2
      number of guard bits: 1
      step size pairs: 16
      region of interest shift: 0
    component #3
      coding style: 1
      number of resolutions: 6
      code block width/height: (5, 5)
      code block coding style: 0
      discrete wavelet transform identifier: 0
      quantization style: 2
      number of guard bits: 1
      step size pairs: 16
      region of interest shift: 0
/*
 * j2c-scan.cpp -- parse j2c file and display data concerning it
 *
 * $Id$
 *
 * This program requires version 1.5.2 of the OpenJPEG
 * library. Furthermore, it requires the header files "openjpeg.h" and
 * "j2k.h" from its source distribution. Copy these headers to your
 * build directory. After doing so, execute the following to build:
 * $ c++ -o j2c-scan j2c-scan.cpp -lopenjpeg
 */

#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include "openjpeg.h"
#include "j2k.h"

static void
j2k_dump_cp (opj_image_t * image, opj_cp_t * cp)
{
  const char *s;
  int i, j;
  int step_size_pairs;

  printf ("coding parameters\n");

  if (cp->comment != NULL) {
    printf ("  coding comment: %p\n", cp->comment);
  }

  switch (cp->cinema) {
  case OFF:
    s = "none";
    break;
  case CINEMA2K_24:
    s = "2k @ 24 fps";
    break;
  case CINEMA2K_48:
    s = "2k @ 48 fps";
    break;
  case CINEMA4K_24:
    s = "4k @ 24 fps";
    break;
  default:
    s = "unknown";
    break;
  }
  printf ("  digital cinema profile: %s\n", s);

  switch (cp->rsiz) {
  case STD_RSIZ:
    s = "standard";
    break;
  case CINEMA2K:
    s = "2k digital cinema";
    break;
  case CINEMA4K:
    s = "4k digital cinema";
    break;
  default:
    s = "unknown";
    break;
  }
  printf ("  rsiz capabilities: %s\n", s);

  printf ("  pixel offset from top-left corner: (%d, %d)\n", cp->tx0, cp->ty0);
  printf ("  tile width/height in pixels: (%d, %d)\n", cp->tdx, cp->tdy);
  printf ("  image width/height in tiles: (%d, %d)\n", cp->tw, cp->th);

  for (i = 0; i < cp->tw * cp->th; i++) {
    printf ("  tile #%d\n", i + 1);
    printf ("    coding style: %x\n", cp->tcps[i].csty);

    switch (cp->tcps[i].prg) {
    case LRCP:
      s = "Layer-Resolution-Component-Position";
      break;
    case RLCP:
      s = "Resolution-Layer-Component-Position";
      break;
    case RPCL:
      s = "Resolution-Position-Component-Layer";
      break;
    case PCRL:
      s = "Position-Component-Resolution-Layer";
      break;
    case CPRL:
      s = "Component-Position-Resolution-Layer";
      break;
    default:
      s = "unknown";
      break;
    }
    printf ("    progression order: %s\n", s);

    printf ("    POC marker flag: %d\n", cp->tcps[i].POC);
    printf ("    number of quality layers: %d\n", cp->tcps[i].numlayers);

    for (j = 0; j < cp->tcps[i].numlayers; j++) {
      printf ("      rate for layer #%d: %.1f\n", j + 1, cp->tcps[i].rates[j]);
    }

    printf ("    multi-component transform flag: %d\n", cp->tcps[i].mct);

    for (j = 0; j < image->numcomps; j++) {
      printf ("    component #%d\n", j + 1);
      printf ("      coding style: %x\n", cp->tcps[i].tccps[j].csty);
      printf ("      number of resolutions: %d\n", cp->tcps[i].tccps[j].numresolutions);
      printf ("      code block width/height: (%d, %d)\n",
              cp->tcps[i].tccps[j].cblkw, cp->tcps[i].tccps[j].cblkh);
      printf ("      code block coding style: %x\n", cp->tcps[i].tccps[j].cblksty);
      printf ("      discrete wavelet transform identifier: %d\n",
              cp->tcps[i].tccps[j].qmfbid);
      printf ("      quantization style: %d\n", cp->tcps[i].tccps[j].qntsty);
      printf ("      number of guard bits: %d\n", cp->tcps[i].tccps[j].numgbits);

      step_size_pairs = (cp->tcps[i].tccps[j].qntsty == J2K_CCP_QNTSTY_SIQNT)
        ? 1 : cp->tcps[i].tccps[j].numresolutions * 3 - 2;
      printf ("      step size pairs: %d\n", step_size_pairs);

      printf ("      region of interest shift: %d\n", cp->tcps[i].tccps[j].roishift);
    }
  }
}

void
error_callback (const char *msg, void *client_data)
{
  FILE *stream = (FILE *) client_data;
  fprintf (stream, "[ERROR] %s", msg);
}

void
warning_callback (const char *msg, void *client_data)
{
  FILE *stream = (FILE *) client_data;
  fprintf (stream, "[WARNING] %s", msg);
}

int
main (int argc, char *argv[])
{
  char *filename;               /* name of the file to process */
  FILE *fp;                     /* input file pointer */
  int file_length;              /* length of the input file */
  unsigned char *buffer = NULL; /* in-memory buffer containing the input file */
  opj_cio_t *cio = NULL;        /* OpenJPEG wrapper around file buffer */
  opj_dparameters_t parameters; /* decompression parameters */
  opj_dinfo_t *dinfo = NULL;    /* pointer to a JPEG-2000 decompressor */
  opj_event_mgr_t event_mgr;    /* manager of events' callback functions */
  opj_image_t *image = NULL;    /* pointer to the decoded image */

  memset (&event_mgr, 0, sizeof (opj_event_mgr_t));
  event_mgr.error_handler = error_callback;
  event_mgr.warning_handler = warning_callback;
  event_mgr.info_handler = NULL;

  /* establish default decoding parameters for JPEG-2000 codestreams */
  opj_set_default_decoder_parameters (&parameters);
  parameters.decod_format = 0;

  if (argc != 2) {
    fprintf (stderr, "USAGE: j2c-scan file.j2c\n");
    return 1;
  }

  filename = argv[1];
  strncpy (parameters.infile, filename, sizeof (parameters.infile) - 1);

  /* read the input file and put it in memory */
  fp = fopen (parameters.infile, "rb");
  if (fp == NULL) {
    perror ("fopen");
    return 2;
  }

  fseek (fp, 0, SEEK_END);
  file_length = (int) ftell (fp);
  fseek (fp, 0, SEEK_SET);
  buffer = (unsigned char *) malloc (file_length);
  fread (buffer, sizeof (unsigned char), file_length, fp);
  fclose (fp);

  /* decode the JPEG-2000 codestream */
  dinfo = opj_create_decompress (CODEC_J2K);
  opj_set_event_mgr ((opj_common_ptr) dinfo, &event_mgr, stderr);
  opj_setup_decoder (dinfo, &parameters);
  cio = opj_cio_open ((opj_common_ptr) dinfo, buffer, file_length);
  image = opj_decode (dinfo, cio);

  if (image == NULL) {
    fprintf (stderr, "ERROR -> j2c-scan: failed to decode image!\n");
    opj_destroy_decompress (dinfo);
    opj_cio_close (cio);
    free (buffer);
    return 1;
  }

  opj_cio_close (cio);
  free (buffer);

  /* display information about the image */
  j2k_dump_cp (image, ((opj_j2k_t *) dinfo->j2k_handle)->cp);

  /* free the memory */
  opj_destroy_decompress (dinfo);
  opj_image_destroy (image);

  return 0;
}
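A small part of the information printed by j2c-scan can be recovered without a full decoder: a JPEG 2000 codestream begins with the SOC marker (FF 4F) immediately followed by the SIZ marker segment (FF 51), whose first fields carry the capabilities value (Rsiz) and the image dimensions. The helper below is an illustrative sketch, not part of the Reference Materials; per ISO/IEC 15444-1 Amendment 1, Rsiz values 3 and 4 denote the 2K and 4K digital cinema profiles.

```python
import struct

def parse_siz(codestream: bytes):
    """Return (Rsiz, Xsiz, Ysiz) from the start of a JPEG 2000 codestream.
    Layout: SOC marker (2 bytes), SIZ marker (2), Lsiz (2), then Rsiz (2)
    followed by the 32-bit image width and height, all big-endian."""
    if codestream[0:2] != b"\xff\x4f" or codestream[2:4] != b"\xff\x51":
        raise ValueError("not a JPEG 2000 codestream")
    return struct.unpack(">HII", codestream[6:16])
```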
The section "eab_calc.py" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
This program reads one or more XML files containing d-cinema metadata and tests each of the UUID values for compliance with [RFC-4122]. The program will emit a message on stderr for each malformed UUID that is encountered. The following example illustrates this usage for a KDM file:
$ uuid_check.py Example.kdm.xml
UUID: 7556bff9-58f9-4320-bb1f-fb594219a957
UUID: bdb3a717-5062-4822-8dfc-0dc6570cc116
UUID: 71f7926e-8ce6-4763-b14b-0ef7dcd952f5
UUID: 6083adad-472c-43da-b131-c6dc601cd154
UUID: aeaae312-a257-11da-a601-8b319b685f8e
#!/usr/bin/env python
#
# uuid_check.py -- Scan an XML file and see that all UUID values
# conform to RFC-4122
#
# $Id$
#

from __future__ import print_function
import sys, re

# regular expressions for use below
urn_uuid_re = re.compile('urn:uuid:([^<]*)')
uuid_re = re.compile('^[0-9a-f]{8}-[0-9a-f]{4}-([1-5])[0-9a-f]{3}-[8-9a-b][0-9a-f]{3}-[0-9a-f]{12}$',
                     re.IGNORECASE)

#
def uuid_scan(text):
    uuid_list = []

    while text:
        match = urn_uuid_re.search(text)

        if not match:
            break

        uuid_val = match.group(1)
        text = text[match.end():]
        match = uuid_re.match(uuid_val)

        if not match:
            sys.stderr.write("urn:uuid: value is not an RFC-4122 UUID: %s\n" % (uuid_val))
            continue

        type = int(match.group(1)[0])

        if type not in (1, 4, 5):
            sys.stderr.write("Unexpected UUID type: %d for value %s\n" % (type, uuid_val))

        uuid_list.append(uuid_val)

    return uuid_list

#
#
if len(sys.argv) < 2:
    sys.stderr.write("usage: uuid_check.py <xml-file> [...]\n")
    sys.exit(1)

for filename in sys.argv[1:]:
    try:
        handle = open(filename)
        text = handle.read()
        handle.close()
    except Exception as e:
        print("{0}: {1}".format(filename, e))
    else:
        for uuid in uuid_scan(text):
            print("UUID: {0}".format(uuid))

#
# end uuid_check.py
#
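The regular-expression test above can be cross-checked against Python's standard uuid module. The variant below is an illustrative sketch (not part of the Reference Materials) that accepts the same version 1, 4 and 5 RFC 4122 UUIDs; note that uuid.UUID() is more permissive about input formatting than the regular expression in uuid_check.py.

```python
import uuid

def is_dcinema_uuid(value: str) -> bool:
    """True if value parses as an RFC 4122 UUID of version 1, 4 or 5,
    the types accepted by uuid_check.py."""
    try:
        parsed = uuid.UUID(value)
    except ValueError:
        return False
    return parsed.variant == uuid.RFC_4122 and parsed.version in (1, 4, 5)
```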
This program reads a signed XML file and re-writes the file to the standard output using the certificate order expected by the checksig program from the XML Security package. The following example illustrates this usage for a KDM file:
$ dsig_cert.py test-kdm.xml >tmp.xml
$ checksig tmp.xml
Signature verified OK!
#!/usr/bin/env python
#
# dsig_cert.py -- Re-order certificates in an XML signature
#
# NOTE: This program requires Python 2.7 or greater
#
# $Id$
#

from __future__ import print_function
import sys, re
from subprocess import Popen, PIPE

# regular expressions for use below
SignatureValue_end_re = re.compile('</(?:[\w\-]+:)?SignatureValue[^>]*>')
X509Data_re = re.compile('<(?:[\w\-]+:)X509Data[^>]*>(.*?)</(?:[\w\-]+:)X509Data\s*>\s+', re.DOTALL)
X509Certificate_re = re.compile('X509Certificate[^>]*>(.*?)</', re.DOTALL)
dnQualifier_re = re.compile('dnQualifier=([\\w+/]+=)')

#
def get_dnq_type(pem_text, type):
    """Extract the dnQualifier value for the given certificate and common name."""
    handle = Popen(('/usr/bin/openssl', 'x509', '-noout', '-'+type),
                   stdin=PIPE, stdout=PIPE, close_fds=True)
    handle.stdin.write(pem_text)
    handle.stdin.close()
    name_text = handle.stdout.read()
    handle.wait()

    if handle.returncode != 0:
        raise Exception("No X509Certificate element in {0}".format(pem_text))

    dnq = dnQualifier_re.search(name_text.replace('\\', ''))

    if not dnq:
        raise Exception("Error retrieving dnQualifier from {0}.".format(type))

    return dnq.group(1)

#
def PEMify(base64_text):
    """Create canonical PEM lines from any base64 input."""
    in_text = re.sub('[\r\n]', '', base64_text)
    idx = 0
    end = len(in_text)
    retval = ''

    while idx < end:
        retval += in_text[idx:idx+64] + '\n'
        idx += 64

    return retval

#
class dsig_certificate_set:
    """An object for manipulating XML Signature certificates."""

    def __init__(self, xml_doc):
        """Initialize with a signed XML document string."""
        body_end = SignatureValue_end_re.search(xml_doc)

        if not body_end:
            raise Exception("Document does not contain a SignatureValue element.")

        self.kdm_head = xml_doc[:body_end.end()]
        xml_doc = xml_doc[body_end.end():]
        self.X509Data_list = []
        x509_data = X509Data_re.search(xml_doc)

        if x509_data:
            self.kdm_head += xml_doc[:x509_data.start()]

        while x509_data:
            x509_text = xml_doc[x509_data.start():x509_data.end()]
            self.X509Data_list.append({ 'text': x509_text })
            xml_doc = xml_doc[x509_data.end():]
            x509_data = X509Data_re.search(xml_doc)

        self.kdm_tail = xml_doc

        for x509_data in self.X509Data_list:
            # extract the certificate
            cert = X509Certificate_re.search(x509_data['text'])

            if not cert:
                raise Exception("No X509Certificate element in {0}".format(x509_data['text']))

            cert = PEMify(cert.group(1))
            cert = "-----BEGIN CERTIFICATE-----\n%s-----END CERTIFICATE-----\n" % (cert)
            x509_data['subject_dnq'] = get_dnq_type(cert, 'subject')
            x509_data['issuer_dnq'] = get_dnq_type(cert, 'issuer')
            x509_data['pem_cert'] = cert

    def order_by_dnq(self):
        """Arrange certificates in leaf-root order."""
        root = None
        issuer_map = {}

        for x509_data in self.X509Data_list:
            if x509_data['subject_dnq'] == x509_data['issuer_dnq']:
                if root:
                    raise Exception("Certificate list contains multiple roots.")
                root = x509_data
            else:
                issuer_map[x509_data['issuer_dnq']] = x509_data

        if not root:
            raise Exception("Self-signed root certificate not found.")

        tmp_list = [root]

        try:
            key = tmp_list[-1]['subject_dnq']
            next = issuer_map[key]

            while next:
                tmp_list.append(next)
                key = tmp_list[-1]['subject_dnq']
                next = issuer_map[key]
        except:
            pass

        if len(self.X509Data_list) != len(tmp_list):
            raise Exception("Certificates do not form a complete chain.")

        tmp_list.reverse()
        self.X509Data_list = tmp_list
        return self

    def write_certs(self, prefix='cert_set_'):
        """Write PEM certificates to files using the optional filename prefix value."""
        count = 1

        for x509_data in self.X509Data_list:
            filename = "%s%d.pem" % (prefix, count)
            handle = open(filename, 'w')
            handle.write(x509_data['pem_cert'])
            handle.close()
            count += 1

    def __repr__(self):
        cert_text = ''

        for cert in self.X509Data_list:
            cert_text += cert['text']

        return self.kdm_head + cert_text + self.kdm_tail

#
if __name__ == '__main__':
    if len(sys.argv) < 2:
        sys.stderr.write("usage: dsig_cert.py <xml-file>\n")
        sys.exit(1)

    try:
        handle = open(sys.argv[1])
        text = handle.read()
        handle.close()
        cert_set = dsig_certificate_set(text)
        cert_set.order_by_dnq()
        print(cert_set)
    except Exception as e:
        print(e)

#
# end dsig_cert.py
#
This program reads a signed XML file and writes the certificates contained within to individual PEM files. As shown below, the -p option can be used to provide a prefix for the automatically-generated filenames. In this example, the input document contained four certificates.
$ dsig_extract.py -p my_prefix_ test-kdm.xml
$ ls my_prefix_*
my_prefix_1.pem  my_prefix_2.pem  my_prefix_3.pem  my_prefix_4.pem
#!/usr/bin/env python
#
# dsig_extract.py -- Extract certificates from an XML signature
#
# $Id$
#

from __future__ import print_function
from dsig_cert import dsig_certificate_set
import sys

prefix = 'xmldsig_cert_'
filename = None

def usage():
    sys.stderr.write("usage: dsig_extract.py [-p <prefix>] <xml-file>\n")
    sys.exit(1)

if len(sys.argv) < 2:
    usage()

if sys.argv[1] == '-p':
    if len(sys.argv) < 4:
        usage()
    prefix = sys.argv[2]
    filename = sys.argv[3]
else:
    filename = sys.argv[1]

try:
    handle = open(filename)
    text = handle.read()
    handle.close()
    set = dsig_certificate_set(text)
    set.write_certs(prefix=prefix)
except Exception as e:
    print(e)

#
# end dsig_extract.py
#
The section "ASM Simulator" was deleted. The section number is maintained here to preserve the numbering of subsequent sections.
The GPIO test fixture has eight outputs, which connect to ground via normally-open switch contacts. These outputs are expected to interface to command and/or status inputs of the d-cinema equipment under test.
The fixture has eight inputs, which connect to powered, current limited LEDs and will illuminate when the corresponding input is grounded. These inputs interface to command and/or status outputs of the d-cinema equipment under test.
Example circuits are provided below. Outputs, inputs and ground are interfaced via a single DB-25 female connector on the test fixture.
Testing Entities are not required to follow the above design, and are free to develop their own equipment and connector standards. The manufacturer of the d-cinema equipment being tested is responsible for providing a cable, appropriate for the individual Test Subject, that will interface to the test fixture being used.
Note that the LED inputs are internally current limited; external devices are expected to sink 25 mA per channel. The test fixture has an integral PSU (the PSU may instead be external, but it must use a connector other than the DB-25 interface).
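For bench planning, the fixture's external behavior can be modeled in software. The class below is an illustrative sketch only; its names and structure are assumptions for this example and do not describe a required implementation:

```python
class GPIOFixtureModel:
    """Toy model of the eight-output / eight-input GPIO test fixture."""

    def __init__(self):
        # Normally-open contacts; True means the switch is closed and
        # the corresponding fixture output is connected to ground.
        self.switches = [False] * 8
        # True means the device under test grounds the input, lighting the LED.
        self.led_inputs = [False] * 8

    def close_switch(self, channel):
        # Closing a switch grounds the fixture output, asserting a
        # command/status input on the device under test.
        self.switches[channel] = True

    def open_switch(self, channel):
        self.switches[channel] = False

    def device_grounds_input(self, channel, grounded):
        # A device status output sinking current (nominally 25 mA)
        # illuminates the current-limited LED on that channel.
        self.led_inputs[channel] = grounded

    def lit_leds(self):
        return [i for i, lit in enumerate(self.led_inputs) if lit]

fixture = GPIOFixtureModel()
fixture.close_switch(0)
fixture.device_grounds_input(3, True)
print(fixture.switches[0], fixture.lit_leds())  # True [3]
```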
The following describes evaluation requirements, basic and specific pass/fail criteria to be used when testing the subtitle rendering capabilities of the Test Subject, as called for in 6.7.1. Media Block Overlay .
It is expected that the Test Operator shall, by referencing both the CPL and the referenced subtitle track file XML, confirm that all elements expected to appear on (or off) screen for each scene, do so with all intended characteristics. This includes, but is not limited to, positioning, alignment, font size, color, script, effect, effect color, italic, underline, bold, aspect adjust and spacing, whether specified directly, default values, or inherited from ancestor values or defaults.
The colorimetric relationship between the PNG image and the image contained in the DCDM* is, at the time of publication of this document, under study. Unexpected appearance of saturation, hue, luminance and bit-depth of PNG images should be noted in the test results and brought to the attention of the manufacturer pending further work to quantify this relationship. At this time, the Test Subject shall not be subject to failing this test on such characteristics.
The behavior of the Direction attribute of the Text element and/or the Unicode Bidirectional Algorithm is, at the time of publication of this document, under study. Scenes that utilize these features should be noted in the test results and brought to the attention of the manufacturer pending further work to quantify this relationship. At this time, the Test Subject shall not be subject to failing this test on such characteristics.
This section describes general pass/fail criteria to be applied to all scenes unless Specific Criteria directs otherwise.
Main Picture Image Track Files - Labels : The image track files referenced by the compositions have a burned-in label in a small text font, centered horizontally and close to the bottom of the main picture. The label is comprised of the image structure of the sequence being viewed (2K, 4K, or 2K-48fps), the aspect ratio, and the name of the scene. The name of the scene may be used to locate specific pass/fail requirements for a particular scene, and descriptive notes to the test operator, providing additional context.
Bounding Boxes : White bounding boxes, one pixel wide, are burned into the image track files to confirm correct positioning of the rendered timed text. Some slides, mainly those that announce upcoming scenes, present timed text that does not have bounding boxes. When bounding boxes are displayed, the associated text is intended to fall completely within the boxes. Differences in implementations can produce significant differences in the vertical positioning of text, depending on whether the renderer uses the baseline of the text characters or the edges of the rendered characters for positioning. Exceeding the bounding boxes shall not be cause to fail the test.
Composition Main Picture and Alpha Channel Timing : The appearance of a particular label on the main picture is intended to be accompanied by that scene's rendered text and/or PNG images, as determined by the provided FadeUpTime and FadeDownTime Text and/or Image element parameters, or their default value of 2 frames if none are specified. For example, for the beginning title slide, "2K-scope-title", the image track file will display the label for 240 frames (10 seconds). The FadeUpTime and FadeDownTime parameters are not specified for the accompanying timed text, so for the 1st frame that displays the image label no timed text should be visible, the 2nd frame should have the timed text at 50% opacity of the rendered intent (the element's final opacity is specified by a parameter of the Font element), and frames 3 through 238 inclusive should have the timed text at 100% opacity of the rendered intent. Frame 239 should be identical to frame 2, and frame 240 should have no visible timed text. Except where specified, if the timing of the rendered text and/or PNG images differs by more than plus or minus 3 frames from that commanded by the Subtitle DCDM and the controlling CPL, this is cause to fail the test. Some of the slides that test PNG images have FadeUpTime and FadeDownTime values of zero, so it is not expected, or correct behavior, for the corresponding image track labels to be visible with these slides.
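The frame-by-frame arithmetic above can be sketched as a small function. This is only an illustration of the expected opacity schedule, assuming a linear fade and 1-indexed frames; the function name and interface are invented for this example:

```python
def expected_opacity(frame, duration=240, fade_up=2, fade_down=2):
    """Expected timed-text opacity (percent of rendered intent) at a given frame."""
    # Linear ramp over the fade-up frames at the head of the event,
    # and a symmetrical ramp over the fade-down frames at the tail.
    up = (frame - 1) / fade_up if fade_up else 1.0
    down = (duration - frame) / fade_down if fade_down else 1.0
    return round(100 * max(0.0, min(1.0, up, down)))

# The "2K-scope-title" example: 240 frames, default 2-frame fades.
print([expected_opacity(f) for f in (1, 2, 3, 238, 239, 240)])
# [0, 50, 100, 100, 50, 0]
```

With fade values of zero, the function returns full opacity for every frame of the event, matching the instantaneous transitions described for the PNG test slides.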
This section lists pass/fail criteria specific to each scene in the composition. The identifier for each scene is constructed by prepending the image structure and aspect ratio of the variant under test to the scene title, e.g. "2K-scope-title" or "2K-48fpsflat-title" as applicable. For each of the scenes, refer to the text below the identifier, where scene-specific elements may be described more fully. Scene-specific pass/fail requirements are presented as bullet lists.
FadeDownTime and FadeUpTime elements are set to 00:00:00:00, which will result in a seamless transition between the subtitle instances. FadeUpTime and FadeDownTime elements, or their defaults, may be considered to be equal to zero. If any of the fades specified in the XML are rendered as if they were zero, this shall not be cause to fail this test.
This appendix specifies requirements and expected acoustic outcome from the rendering of OBAE Rendering Expectations by an OBAE Sound System .
The OBAE Sound System shall be configured as an Ideal Environment , as specified at [SMPTE-2098-3] .
The test material consists of a sequence of scenes, identified in the top right corner of the image, as illustrated at Figure J.1 .
The expectations for each scene are described using a combination of text and images, as illustrated at Figure J.1 .
Sounds heard during each scene shall conform to the expectations specified for the scene by the corresponding table in J.4. Expectations .
There shall be no sounds heard other than those specified at J.4. Expectations .
Expectations that refer to a specific loudspeaker, e.g. "Left Speaker", shall be skipped if the OBAE Sound System is not equipped with that loudspeaker.
Loudspeakers are defined at [SMPTE-428-12] and [SMPTE-2098-5] .
All sounds heard shall be free of artifacts, e.g. "zipper" noise, discontinuities, clicks.
Text on screen | Expectations |
---|---|
BEEP! | A sound is heard |
Text on screen | Expectations |
---|---|
You should hear SPEAKING on 'Left' | The word "left" is heard from the Left speaker |
You should hear PINK on 'Left' | Pink noise is heard from the Left speaker |
You should hear SPEAKING on 'Right' | The word "right" is heard from the Right speaker |
You should hear PINK on 'Right' | Pink noise is heard from the Right speaker |
You should hear SPEAKING on 'Center' | The word "center" is heard from the Center speaker |
You should hear PINK on 'Center' | Pink noise is heard from the Center speaker |
You should hear SPEAKING on 'LFE' | Sound is heard from the LFE speaker |
You should hear PINK on 'LFE' | Pink noise is heard from the LFE speaker |
You should hear SPEAKING on 'Left Surround' | The word "left surround" is heard from the Left Surround speaker |
You should hear PINK on 'Left Surround' | Pink noise is heard from the Left Surround speaker |
You should hear SPEAKING on 'Right Surround' | The word "right surround" is heard from the Right Surround speaker |
You should hear PINK on 'Right Surround' | Pink noise is heard from the Right Surround speaker |
Text on screen | Expectations |
---|---|
You should hear SPEAKING on 'Left' | The word "left" is heard from the Left speaker |
You should hear PINK on 'Left' | Pink noise is heard from the Left speaker |
You should hear SPEAKING on 'Right' | The word "right" is heard from the Right speaker |
You should hear PINK on 'Right' | Pink noise is heard from the Right speaker |
You should hear SPEAKING on 'Center' | The word "center" is heard from the Center speaker |
You should hear PINK on 'Center' | Pink noise is heard from the Center speaker |
You should hear SPEAKING on 'LFE' | Sound is heard from the LFE speaker |
You should hear PINK on 'LFE' | Pink noise is heard from the LFE speaker |
You should hear SPEAKING on 'Left Side Surround' | The word "left side surround" is heard from the Left Side Surround speaker |
You should hear PINK on 'Left Side Surround' | Pink noise is heard from the Left Side Surround speaker |
You should hear SPEAKING on 'Right Side Surround' | The word "right side surround" is heard from the Right Side Surround speaker |
You should hear PINK on 'Right Side Surround' | Pink noise is heard from the Right Side Surround speaker |
You should hear SPEAKING on 'Left Rear Surround' | The word "left rear surround" is heard from the Left Rear Surround speaker |
You should hear PINK on 'Left Rear Surround' | Pink noise is heard from the Left Rear Surround speaker |
You should hear SPEAKING on 'Right Rear Surround' | The word "right rear surround" is heard from the Right Rear Surround speaker |
You should hear PINK on 'Right Rear Surround' | Pink noise is heard from the Right Rear Surround speaker |
Text on screen | Expectations |
---|---|
You should hear SPEAKING on 'Left' | The word "left" is heard from the Left speaker |
You should hear PINK on 'Left' | Pink noise is heard from the Left speaker |
You should hear SPEAKING on 'Right' | The word "right" is heard from the Right speaker |
You should hear PINK on 'Right' | Pink noise is heard from the Right speaker |
You should hear SPEAKING on 'Center' | The word "center" is heard from the Center speaker |
You should hear PINK on 'Center' | Pink noise is heard from the Center speaker |
You should hear SPEAKING on 'LFE' | Sound is heard from the LFE speaker |
You should hear PINK on 'LFE' | Sound is heard from the LFE speaker |
You should hear SPEAKING on 'Left Side Surround' | The word "left side surround" is heard from the Left Side Surround speaker |
You should hear PINK on 'Left Side Surround' | Pink noise is heard from the Left Side Surround speaker |
You should hear SPEAKING on 'Right Side Surround' | The word "right side surround" is heard from the Right Side Surround speaker |
You should hear PINK on 'Right Side Surround' | Pink noise is heard from the Right Side Surround speaker |
You should hear SPEAKING on 'Left Rear Surround' | The word "left rear surround" is heard from the Left Rear Surround speaker |
You should hear PINK on 'Left Rear Surround' | Pink noise is heard from the Left Rear Surround speaker |
You should hear SPEAKING on 'Right Rear Surround' | The word "right rear surround" is heard from the Right Rear Surround speaker |
You should hear PINK on 'Right Rear Surround' | Pink noise is heard from the Right Rear Surround speaker |
You should hear SPEAKING on 'Left Top Surround' | The word "left top surround" is heard from the Left Top Surround speaker |
You should hear PINK on 'Left Top Surround' | Pink noise is heard from the Left Top Surround speaker |
You should hear SPEAKING on 'Right Top Surround' | The word "right top surround" is heard from the Right Top Surround speaker |
You should hear PINK on 'Right Top Surround' | Pink noise is heard from the Right Top Surround speaker |
Text on screen | Expectations |
---|---|
You should hear pink-noise GainPrefix = ONE | Pink noise is heard. The noise maintains consistent timbre, loudness, and size. |
You should hear Silence GainPrefix = zero | No sound is heard. |
You should hear the volume changing GainPrefix = CUSTOM gain=X.XX | Pink noise is heard. Three times in a row, the loudness of the pink noise monotonically increases from silence, and then monotonically decreases back to silence. The loudness scales with the value of gain . The noise maintains consistent timbre and size. |
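The commanded loudness profile in the last row (three monotonic rises and falls) can be pictured as a triangle envelope. The sketch below only illustrates the expected shape of the gain trajectory; it is not the encoded bitstream behavior:

```python
def triangle_envelope(n, cycles=3, peak=1.0):
    """Gain values rising 0 -> peak -> 0, 'cycles' times over n samples."""
    period = n / cycles
    env = []
    for i in range(n):
        phase = (i % period) / period                 # 0..1 within one cycle
        env.append(peak * (1 - abs(2 * phase - 1)))   # monotonic up, then down
    return env

env = triangle_envelope(300, cycles=3)
print(min(env), max(env))  # 0.0 1.0
```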
Text on screen | Expectations |
---|---|
You should hear pink noise ChannelDecorCoefPrefix = NONE | Pink noise is heard. |
You should hear pink noise ChannelDecorCoefPrefix = MAX | Pink noise is heard. |
You should hear pink noise ChannelDecorCoefPrefix = CUSTOM XX% | Pink noise is heard. The noise maintains consistent loudness. |
Text on screen | Expectations |
---|---|
You should not hear pink noise The 13.1HT pink noise is superseded by one of the 5.1, 7.1DS, 9.1OH beds | No sound is heard. |
Text on screen | Expectations |
---|---|
You should only hear pink noise on Center (NOT from Left) | Pink noise is heard from the Center speaker only |
You should only hear pink noise on Right (NOT from Center) | Pink noise is heard from the Right speaker only |
You should only hear pink noise on Left (NOT from Right) | Pink noise is heard from the Left speaker only |
You should only hear pink noise on Center (NOT from Left Side Surround) | Pink noise is heard from the Center speaker only |
You should only hear pink noise on Left (NOT from Right Side Surround) | Pink noise is heard from the Left speaker only |
You should only hear pink noise on Center (NOT from Left Rear Surround) | Pink noise is heard from the Center speaker only |
You should only hear pink noise on Left (NOT from Right Rear Surround) | Pink noise is heard from the Left speaker only |
You should only hear pink noise on Right (NOT from LFE) | Pink noise is heard from the Right speaker only |
You should only hear pink noise on Center (NOT from Left Height) | Pink noise is heard from the Center speaker only |
You should only hear pink noise on Right (NOT from Center Height) | Pink noise is heard from the Right speaker only |
You should only hear pink noise on Left (NOT from Right Height) | Pink noise is heard from the Left speaker only |
You should only hear pink noise on Center (NOT from Left Surround Height) | Pink noise is heard from the Center speaker only |
You should only hear pink noise on Left (NOT from Right Surround Height) | Pink noise is heard from the Left speaker only |
You should only hear pink noise on Right (NOT from Top Surround) | Pink noise is heard from the Right speaker only |
The noise maintains consistent timbre and loudness in each test. |
Text on screen | Expectations |
---|---|
You should hear: SPEAKING on 'Left' | The word "left" is heard from the Left speaker |
You should hear: PINK on 'Left' | Pink noise is heard from the Left speaker |
You should hear: SPEAKING on 'Left Center' | The word "left center" is heard from the Left Center speaker |
You should hear: PINK on 'Left Center' | Pink noise is heard from the Left Center speaker |
You should hear: SPEAKING on 'Center' | The word "center" is heard from the Center speaker |
You should hear: PINK on 'Center' | Pink noise is heard from the Center speaker |
You should hear: SPEAKING on 'Right Center' | The word "right center" is heard from the Right Center speaker |
You should hear: PINK on 'Right Center' | Pink noise is heard from the Right Center speaker |
You should hear: SPEAKING on 'Right' | The word "right" is heard from the Right speaker |
You should hear: PINK on 'Right' | Pink noise is heard from the Right speaker |
You should hear: SPEAKING on 'Left Side Surround' | The word "left side surround" is heard from the Left Side Surround speaker |
You should hear: PINK on 'Left Side Surround' | Pink noise is heard from the Left Side Surround speaker |
You should hear: SPEAKING on 'Left Surround' | The word "left surround" is heard from the Left Surround speaker |
You should hear: PINK on 'Left Surround' | Pink noise is heard from the Left Surround speaker |
You should hear: SPEAKING on 'Left Rear Surround' | The word "left rear surround" is heard from the Left Rear Surround speaker |
You should hear: PINK on 'Left Rear Surround' | Pink noise is heard from the Left Rear Surround speaker |
You should hear: SPEAKING on 'Right Rear Surround' | The word "right rear surround" is heard from the Right Rear Surround speaker |
You should hear: PINK on 'Right Rear Surround' | Pink noise is heard from the Right Rear Surround speaker |
You should hear: SPEAKING on 'Right Side Surround' | The word "right side surround" is heard from the Right Side Surround speaker |
You should hear: PINK on 'Right Side Surround' | Pink noise is heard from the Right Side Surround speaker |
You should hear: SPEAKING on 'Right Surround' | The word "right surround" is heard from the Right Surround speaker |
You should hear: PINK on 'Right Surround' | Pink noise is heard from the Right Surround speaker |
You should hear: SPEAKING on 'Left Top Surround' | The word "left top surround" is heard from the Left Top Surround speaker |
You should hear: PINK on 'Left Top Surround' | Pink noise is heard from the Left Top Surround speaker |
You should hear: SPEAKING on 'Right Top Surround' | The word "right top surround" is heard from the Right Top Surround speaker |
You should hear: PINK on 'Right Top Surround' | Pink noise is heard from the Right Top Surround speaker |
You should hear: SPEAKING on 'LFE' | Sound is heard from the LFE speaker |
You should hear: PINK on 'LFE' | Sound is heard from the LFE speaker |
You should hear: SPEAKING on 'Left Height' | The word "left height" is heard from the Left Height speaker |
You should hear: PINK on 'Left Height' | Pink noise is heard from the Left Height speaker |
You should hear: SPEAKING on 'Right Height' | The word "right height" is heard from the Right Height speaker |
You should hear: PINK on 'Right Height' | Pink noise is heard from the Right Height speaker |
You should hear: SPEAKING on 'Center Height' | The word "center height" is heard from the Center Height speaker |
You should hear: PINK on 'Center Height' | Pink noise is heard from the Center Height speaker |
You should hear: SPEAKING on 'Left Surround Height' | The word "left surround height" is heard from the Left Surround Height speaker |
You should hear: PINK on 'Left Surround Height' | Pink noise is heard from the Left Surround Height speaker |
You should hear: SPEAKING on 'Right Surround Height' | The word "right surround height" is heard from the Right Surround Height speaker |
You should hear: PINK on 'Right Surround Height' | Pink noise is heard from the Right Surround Height speaker |
You should hear: SPEAKING on 'Left Side Surround Height' | The word "left side surround height" is heard from the Left Side Surround Height speaker |
You should hear: PINK on 'Left Side Surround Height' | Pink noise is heard from the Left Side Surround Height speaker |
You should hear: SPEAKING on 'Right Side Surround Height' | The word "right side surround height" is heard from the Right Side Surround Height speaker |
You should hear: PINK on 'Right Side Surround Height' | Pink noise is heard from the Right Side Surround Height speaker |
You should hear: SPEAKING on 'Left Rear Surround Height' | The word "left rear surround height" is heard from the Left Rear Surround Height speaker |
You should hear: PINK on 'Left Rear Surround Height' | Pink noise is heard from the Left Rear Surround Height speaker |
You should hear: SPEAKING on 'Right Rear Surround Height' | The word "right rear surround height" is heard from the Right Rear Surround Height speaker |
You should hear: PINK on 'Right Rear Surround Height' | Pink noise is heard from the Right Rear Surround Height speaker |
You should hear: SPEAKING on 'Top Surround' | The word "top surround" is heard from the Top Surround speaker |
You should hear: PINK on 'Top Surround' | Pink noise is heard from the Top Surround speaker |
Text on screen | Expectations |
---|---|
You should hear pink noise GainPrefix = ONE | Pink noise is heard. The noise maintains consistent timbre, location, size and loudness. |
You should hear silence GainPrefix = ZERO | No sound is heard. |
You should hear the volume changing GainPrefix = CUSTOM (X.XX) | Pink noise is heard. Three times in a row, the loudness of the pink noise monotonically increases from silence, and then monotonically decreases back to silence. The loudness scales with the value of GainPrefix . The noise maintains consistent timbre and size. |
Text on screen | Expectations |
---|---|
Snap off Angle: XXX° | A point source emitting pink noise is heard, making four revolutions clockwise around the room. The motion of the point source is continuous. |
Snap On Angle: XXX° | A point source emitting pink noise is heard, making four revolutions clockwise around the room. Sound is heard from only one loudspeaker at any given time. |
SnapTolerance Angle: XXX° | A point source emitting pink noise is heard, making four revolutions clockwise around the room. Sound may be heard from one or more loudspeakers at any given time. |
The noise maintains consistent timbre, loudness, and size in each test. |
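One way to picture the three snap modes is a nearest-loudspeaker selection with an optional tolerance. The sketch below is illustrative only; the speaker azimuths and function interface are hypothetical, not part of the bitstream specification:

```python
def angular_distance(a, b):
    # Shortest angular separation between two azimuths, in degrees.
    d = abs(a - b) % 360
    return min(d, 360 - d)

def snap(angle, speaker_azimuths, snap_on=False, tolerance=None):
    """Return the azimuth the source is reproduced at, or None to pan normally."""
    nearest = min(speaker_azimuths, key=lambda s: angular_distance(angle, s))
    if snap_on:
        return nearest          # Snap On: always exactly one loudspeaker
    if tolerance is not None and angular_distance(angle, nearest) <= tolerance:
        return nearest          # SnapTolerance: snap only when close enough
    return None                 # Snap off: continuous panning between speakers

speakers = [0, 90, 180, 270]    # hypothetical azimuths, degrees
print(snap(100, speakers, snap_on=True))   # 90
print(snap(100, speakers, tolerance=5))    # None (source is panned)
print(snap(92, speakers, tolerance=5))     # 90
```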
Text on screen | Expectations |
---|---|
Angle: XXX° You should hear pink noise ONLY in the SCREEN LEFT zone | Pink noise is heard only from screen loudspeakers left of center. |
Angle: XXX° You should hear pink noise ONLY in the SCREEN CENTER zone | Pink noise is heard only from screen center loudspeakers. |
Angle: XXX° You should hear pink noise ONLY in the SCREEN RIGHT zone | Pink noise is heard only from screen loudspeakers right of center. |
Angle: XXX° You should hear pink noise ONLY in the WALL LEFT zone | Pink noise is heard only from loudspeakers on the left wall. |
Angle: XXX° You should hear pink noise ONLY in the WALL RIGHT zone | Pink noise is heard only from loudspeakers on the right wall. |
Angle: XXX° You should hear pink noise ONLY in the REAR LEFT zone | Pink noise is heard only from loudspeakers on the left half of the rear wall. |
Angle: XXX° You should hear pink noise ONLY in the REAR RIGHT zone | Pink noise is heard only from loudspeakers on the right half of the rear wall. |
Angle: XXX° You should hear pink noise ONLY in the OVERHEAD LEFT zone | Pink noise is heard only from overhead loudspeakers left of center. |
Angle: XXX° You should hear pink noise ONLY in the OVERHEAD RIGHT zone | Pink noise is heard only from overhead loudspeakers right of center. |
The noise maintains consistent timbre, loudness, and size in each test. |
Text on screen | Expectations |
---|---|
Angle: XXX° You should hear pink noise ONLY in the SCREEN LEFT zone | Pink noise is heard only from screen loudspeakers left of center. |
Angle: XXX° You should hear pink noise ONLY in the SCREEN CENTER zone | Pink noise is heard only from screen center loudspeakers. |
Angle: XXX° You should hear pink noise ONLY in the SCREEN RIGHT zone | Pink noise is heard only from screen loudspeakers right of center. |
Angle: XXX° You should hear pink noise ONLY in the WALL LEFT zone | Pink noise is heard only from loudspeakers on the left wall. |
Angle: XXX° You should hear pink noise ONLY in the WALL RIGHT zone | Pink noise is heard only from loudspeakers on the right wall. |
Angle: XXX° You should hear pink noise ONLY in the REAR LEFT zone | Pink noise is heard only from loudspeakers on the left half of the rear wall. |
Angle: XXX° You should hear pink noise ONLY in the REAR RIGHT zone | Pink noise is heard only from loudspeakers on the right half of the rear wall. |
Angle: XXX° You should hear pink noise ONLY in the OVERHEAD LEFT zone | Pink noise is heard only from overhead loudspeakers left of center. |
Angle: XXX° You should hear pink noise ONLY in the OVERHEAD RIGHT zone | Pink noise is heard only from overhead loudspeakers right of center. |
The noise maintains consistent timbre, loudness, and size in each test. |
Text on screen | Expectations |
---|---|
Location: Overhead Spread Mode: Low-Rez Spread Value: X.XX | Pink noise is heard centered overhead. The perceived extent of the sound source may fluctuate with the Spread Value . |
Location: Overhead Spread Mode: One-D Spread Value: X.XX | Pink noise is heard centered overhead. The perceived extent of the sound source may fluctuate with the Spread Value . |
Location: Overhead Spread Mode: Three-D Spread Value: x=X.XX, y=X.XX, z=X.XX | Pink noise is heard centered overhead. The perceived extent of the sound source may fluctuate with the Spread Value . |
Location: Screen Spread Mode: Low-Rez Spread Value: X.XX | Pink noise is heard centered on the screen. The perceived extent of the sound source may fluctuate with the Spread Value . |
Location: Screen Spread Mode: One-D Spread Value: X.XX | Pink noise is heard centered on the screen. The perceived extent of the sound source may fluctuate with the Spread Value . |
Location: Screen Spread Mode: Three-D Spread Value: x=X.XX, y=X.XX, z=X.XX | Pink noise is heard centered on the screen. The perceived extent of the sound source may fluctuate with the Spread Value . |
The noise maintains consistent loudness in each test. |
Text on screen | Expectations |
---|---|
You should hear pink noise ObjectDecorCoefPrefix = NONE Location: Screen | Pink noise is heard, centered on the screen. |
You should hear pink noise ObjectDecorCoefPrefix = MAX Location: Screen | Pink noise is heard, centered on the screen. |
You should hear pink noise ObjectDecorCoefPrefix = CUSTOM (XX%) Location: Screen | Pink noise is heard, centered on the screen. |
You should hear pink noise ObjectDecorCoefPrefix = NONE Location: Rear | Pink noise is heard, centered on the rear wall. |
You should hear pink noise ObjectDecorCoefPrefix = MAX Location: Rear | Pink noise is heard, centered on the rear wall. |
You should hear pink noise ObjectDecorCoefPrefix = CUSTOM (XX%) Location: Rear | Pink noise is heard, centered on the rear wall. |
You should hear pink noise ObjectDecorCoefPrefix = NONE Location: Left | Pink noise is heard, centered on the left wall. |
You should hear pink noise ObjectDecorCoefPrefix = MAX Location: Left | Pink noise is heard, centered on the left wall. |
You should hear pink noise ObjectDecorCoefPrefix = CUSTOM (XX%) Location: Left | Pink noise is heard, centered on the left wall. |
You should hear pink noise ObjectDecorCoefPrefix = NONE Location: Right | Pink noise is heard, centered on the right wall. |
You should hear pink noise ObjectDecorCoefPrefix = MAX Location: Right | Pink noise is heard, centered on the right wall. |
You should hear pink noise ObjectDecorCoefPrefix = CUSTOM (XX%) Location: Right | Pink noise is heard, centered on the right wall. |
The noise maintains consistent loudness in each test. |
Text on screen | Expectations |
---|---|
You should hear a 3 note chord (one note per object) Spread On | A sound is heard, centered on the screen. |
You should hear a 3 note chord (one note per object) Snap Off, Spread Off | A sound is heard, centered on the screen. |
You should hear a 3 note chord (one note per object) Snap On | A sound is heard, centered on the screen. |
You should hear a 3 note chord (one note per object) SnapTolerance | A sound is heard, centered on the screen. |
Text on screen | Expectations |
---|---|
Sub-blocks: XXXXXX RPM: XX Angle: XXXXXXXX | Pink noise is heard, moving clockwise around the room. The noise maintains consistent loudness, timbre and size. |
Text on screen | Expectations |
---|---|
You should hear no drop-outs Active Object: XX | Pink noise is heard. The noise maintains consistent loudness and timbre. The location of the noise may shift but remains primarily on the screen. |
Text on screen | Expectations |
---|---|
You should hear no drop-outs Active Object: XX | Pink noise is heard. The noise maintains consistent loudness and timbre. The location of the noise may shift but remains primarily on the screen. |
Text on screen | Expectations |
---|---|
You should hear no drop-outs Active Object: XX | Pink noise is heard. The noise maintains consistent loudness and timbre. The location of the noise may shift but remains primarily on the screen. |
Text on screen | Expectations |
---|---|
You should hear no drop-outs Active Object: XX | Pink noise is heard. The noise maintains consistent loudness and timbre. The location of the noise may shift but remains primarily on the screen. |
Text on screen | Expectations |
---|---|
You should hear no drop-outs Active Object: XX | Pink noise is heard. The noise maintains consistent loudness and timbre. The location of the noise may shift but remains primarily on the screen. |
Text on screen | Expectations |
---|---|
You should hear no drop-outs Active Object: XX | Pink noise is heard. The noise maintains consistent loudness and timbre. The location of the noise may shift but should remain primarily on the screen. |
Text on screen | Expectations |
---|---|
You should hear no drop-outs Active Object: XX | Pink noise is heard. The noise maintains consistent loudness and timbre. The location of the noise may shift but should remain primarily on the screen. |
Text on screen | Expectations |
---|---|
You should hear no drop-outs Active Object: XX | Pink noise is heard. The noise maintains consistent loudness and timbre. The location of the noise may shift but should remain primarily on the screen. |
Text on screen | Expectations |
---|---|
You should hear no drop-outs Active Object: XX | Pink noise is heard. The noise maintains consistent loudness and timbre. The location of the noise may shift but should remain primarily on the screen. |
Text on screen | Expectations |
---|---|
You should hear no drop-outs Active Object: XX | Pink noise is heard. The noise maintains consistent loudness and timbre. The location of the noise may shift but should remain primarily on the screen. |
Text on screen | Expectations |
---|---|
You should hear no drop-outs Active Object: XX | Pink noise is heard. The noise maintains consistent loudness and timbre. The location of the noise may shift but should remain primarily on the screen. |
Text on screen | Expectations |
---|---|
You should hear no drop-outs Active Object: XX | Pink noise is heard. The noise maintains consistent loudness and timbre. The location of the noise may shift but should remain primarily on the screen. |
Text on screen | Expectations |
---|---|
You should hear a 5.1 pink-noise bed Note: authoring info located at the Beginning of the iaFrame ChildElements | Pink noise is heard. The noise maintains consistent loudness, timbre and size. |
Text on screen | Expectations |
---|---|
You should hear a 5.1 pink-noise bed Note: authoring info located at the End of the iaFrame ChildElements | Pink noise is heard. The noise maintains consistent loudness, timbre and size. |
Text on screen | Expectations |
---|---|
You should hear a 5.1 pink-noise bed Unknown element located at the beginning of the IAFrame | Pink noise is heard. The noise maintains consistent loudness, timbre and size. |
Text on screen | Expectations |
---|---|
You should hear a 5.1 pink-noise bed Unknown element located at the end of the IAFrame | Pink noise is heard. The noise maintains consistent loudness, timbre and size. |
Text on screen | Expectations |
---|---|
You should hear a 5.1 pink-noise bed User Data located at the Beginning of the iaFrame ChildElements | Pink noise is heard. The noise maintains consistent loudness, timbre and size. |
Text on screen | Expectations |
---|---|
You should hear a 5.1 pink-noise bed User Data located at the End of the iaFrame ChildElements | Pink noise is heard. The noise maintains consistent loudness, timbre and size. |
Text on screen | Expectations |
---|---|
You should hear a pink-noise bed Audio Description: AMBIENCE, EFFECTS | Pink noise is heard. |
You should hear a pink-noise bed Audio Description: DIALOG, FOLEY, MUSIC | Pink noise is heard. |
You should hear a pink-noise bed Audio Description: AMBIENCE, DIALOG | Pink noise is heard. |
You should hear a pink-noise bed Audio Description: MUSIC | Pink noise is heard. |
You should hear a pink-noise bed Audio Description: AMBIENCE, DIALOG, EFFECTS | Pink noise is heard. |
You should hear a pink-noise bed Audio Description: ADDITIONAL, DIALOG Note: there is a custom Audio Description present | Pink noise is heard. |
The noise maintains consistent loudness, timbre and size in each test. |
This appendix summarizes changes across releases of the CTP, which are listed in Table K.1 .
This appendix is informative and shall not be used to perform or interpret the contents of the CTP.
Version | Release date | Summary of changes |
---|---|---|
1.0 | Not applicable | |
1.1 | K.2. Changes prior to CTP 1.2.1 | |
1.2 | ||
1.2.1 | ||
1.3 | K.3. Changes between CTP 1.2.1 and CTP 1.3 | |
1.3.1 | K.4. Changes between CTP 1.3 and CTP 1.3.1 | |
1.3.2 | K.5. Changes between CTP 1.3.1 and CTP 1.3.2 | |
1.3.3 | K.6. Changes between CTP 1.3.2 and CTP 1.3.3 | |
1.4b | K.7. Changes between CTP 1.3.3 and CTP 1.4b | |
This version | K.8. Changes between CTP 1.4b and this version |
Changes to CTP releases prior to CTP 1.2.1 are described in the collection of documents titled Revision To DCI Digital Cinema System Specification Compliance Test Plan at https://www.dcimovies.com/ .