Data Structures

    class DeltaChunk
    class DeltaChunkList
    class TopdownDeltaChunkList

Functions

    def delta_duplicate(src)
    def delta_chunk_apply(dc, bbuf, write)
    def delta_list_apply(dcl, bbuf, write)
    def delta_list_slice(dcl, absofs, size, ndcl)
    def is_loose_object(m)
    def loose_object_header_info(m)
    def pack_object_header_info(data)
    def create_pack_object_header(obj_type, obj_size)
    def msb_size(data, offset=0)
    def loose_object_header(type, size)
    def write_object(type, size, read, write, chunk_size=chunk_size)
    def stream_copy(read, write, size, chunk_size)
    def connect_deltas(dstreams)
    def apply_delta_data(src_buf, src_buf_size, delta_buf, delta_buf_size, write)
    def is_equal_canonical_sha(canonical_length, match, sha1)

Variables

    decompressobj
    OFS_DELTA
    REF_DELTA
    delta_types
    type_id_to_type_map
    type_to_type_id_map
    chunk_size
def gitdb.fun.apply_delta_data(src_buf, src_buf_size, delta_buf, delta_buf_size, write)
Apply data from a delta buffer using a source buffer to the target file.
:param src_buf: random access data from which the delta was created
:param src_buf_size: size of the source buffer in bytes
:param delta_buf: random access delta data
:param delta_buf_size: size of the delta buffer in bytes
:param write: write method taking a chunk of bytes
**Note:** transcribed to python from the similar routine in patch-delta.c
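The following is a simplified, self-contained sketch of the copy/insert instruction loop the note refers to (the patch-delta.c logic), not gitdb's actual implementation; names such as parse_size and apply_delta are illustrative only.

    def parse_size(delta, i):
        # Decode a variable-length size: 7 bits per byte, least significant
        # group first, high bit set while more bytes follow.
        size = shift = 0
        while True:
            c = delta[i]
            i += 1
            size |= (c & 0x7F) << shift
            shift += 7
            if not c & 0x80:
                return i, size

    def apply_delta(src_buf, delta_buf, write):
        # Replay the delta instructions against src_buf, emitting the result
        # through the write callable.
        i, src_size = parse_size(delta_buf, 0)
        i, target_size = parse_size(delta_buf, i)
        assert src_size == len(src_buf), "delta does not match this source buffer"

        while i < len(delta_buf):
            op = delta_buf[i]
            i += 1
            if op & 0x80:
                # Copy instruction: low 7 bits select which offset/size bytes follow.
                offset = size = 0
                for bit, shift in ((0x01, 0), (0x02, 8), (0x04, 16), (0x08, 24)):
                    if op & bit:
                        offset |= delta_buf[i] << shift
                        i += 1
                for bit, shift in ((0x10, 0), (0x20, 8), (0x40, 16)):
                    if op & bit:
                        size |= delta_buf[i] << shift
                        i += 1
                write(src_buf[offset:offset + (size or 0x10000)])
            elif op:
                # Insert instruction: 'op' literal bytes follow inside the delta.
                write(delta_buf[i:i + op])
                i += op
            else:
                raise ValueError("delta opcode 0 is reserved")

    # Tiny round trip: copy "hello " from the source, then insert "python".
    out = bytearray()
    apply_delta(b"hello world",
                bytes([11, 12, 0x91, 0x00, 0x06, 0x06]) + b"python",
                out.extend)
    assert bytes(out) == b"hello python"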
def gitdb.fun.connect_deltas(dstreams)
Read the condensed delta chunk information from dstream and merge its information
into a list of existing delta chunks
:param dstreams: iterable of delta stream objects, the delta to be applied last
comes first, then all its ancestors in order
:return: DeltaChunkList, containing all operations to apply
def gitdb.fun.create_pack_object_header(obj_type, obj_size)
:return: string defining the pack header comprised of the object type
and its uncompressed size in bytes
:param obj_type: pack type_id of the object
:param obj_size: uncompressed size in bytes of the following object stream
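A minimal sketch of the header layout this function is documented to produce, assuming git's usual pack entry encoding (a 3-bit type id plus a variable-length size: 4 bits in the first byte, 7 bits per continuation byte); encode_pack_header is an illustrative name, not gitdb's API.

    def encode_pack_header(type_id, size):
        # First byte: continuation flag, 3-bit type id, low 4 bits of the size.
        byte = (type_id << 4) | (size & 0x0F)
        size >>= 4
        out = bytearray()
        while size:
            out.append(byte | 0x80)   # more size bytes follow
            byte = size & 0x7F
            size >>= 7
        out.append(byte)
        return bytes(out)

    # A 100-byte object of type id 3 encodes to two bytes.
    assert encode_pack_header(3, 100) == b"\xb4\x06"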
def gitdb.fun.delta_chunk_apply(dc, bbuf, write)
Apply own data to the target buffer.
:param bbuf: buffer providing source bytes for copy operations
:param write: write method to call with data to write
def gitdb.fun.delta_duplicate(src)
def gitdb.fun.delta_list_apply(dcl, bbuf, write)
Apply the chain's changes and write the final result using the passed
write function.
:param bbuf: base buffer containing the base of all deltas contained in this
list. It will only be used if the chunk in question does not have a base
chain.
:param write: function taking a string of bytes to write to the output
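A hedged sketch of how connect_deltas and delta_list_apply are assumed to combine; delta_streams and base_buffer are hypothetical inputs a caller would obtain from a pack (the most recent delta first, plus the fully resolved base buffer).

    import io
    from gitdb.fun import connect_deltas, delta_list_apply

    def resolve(delta_streams, base_buffer):
        # delta_streams: iterable of delta stream objects, most recent delta first
        # base_buffer: fully resolved base the oldest delta was made against
        out = io.BytesIO()
        dcl = connect_deltas(delta_streams)           # merged DeltaChunkList
        delta_list_apply(dcl, base_buffer, out.write) # replay copy/insert operations
        return out.getvalue()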
def gitdb.fun.delta_list_slice(dcl, absofs, size, ndcl)
Write the subsection of this list at the given absolute offset, with the given
size in bytes, into ndcl.
:return: None
def gitdb.fun.is_equal_canonical_sha(canonical_length, match, sha1)
:return: True if the given partial binary sha (match) equals the corresponding
prefix of the given 20 byte binary sha1.
The comparison takes the canonical_length of the match sha into account,
hence for uneven canonical representations only the high 4 bits of the final
byte are compared.
:param match: less than 20 byte sha
:param sha1: 20 byte sha
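An illustrative restatement of the comparison semantics described above (assuming the uneven case compares only the high 4 bits of the final byte); matches_partial_sha is not gitdb's API.

    def matches_partial_sha(canonical_length, match, sha1):
        # match: binary form of a hex sha prefix of canonical_length digits,
        # padded to whole bytes; sha1: full 20 byte binary sha.
        nbytes = canonical_length // 2
        if match[:nbytes] != sha1[:nbytes]:
            return False
        if canonical_length % 2 and (match[nbytes] ^ sha1[nbytes]) & 0xF0:
            # Odd number of hex digits: only the high nibble of the next byte counts.
            return False
        return True

    full = bytes.fromhex("aabbccddeeff00112233445566778899aabbccdd")
    assert matches_partial_sha(5, bytes.fromhex("aabbc0"), full)
    assert not matches_partial_sha(5, bytes.fromhex("aabbd0"), full)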
def gitdb.fun.is_loose_object(m)
:return: True if the file contained in memory map m appears to be a loose object.
Only the first two bytes are needed.
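Since loose objects are stored as raw zlib streams, the two-byte check can be sketched as follows (a standalone illustration, not gitdb's code):

    import zlib

    def looks_like_loose_object(buf):
        # Loose objects are plain zlib streams; the two-byte zlib header has
        # CMF == 0x78 (deflate, 32 KiB window) and (CMF*256 + FLG) % 31 == 0.
        b0, b1 = buf[0], buf[1]
        return b0 == 0x78 and ((b0 << 8) + b1) % 31 == 0

    assert looks_like_loose_object(zlib.compress(b"blob 4\x00test"))
    assert not looks_like_loose_object(b"PACK")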
def gitdb.fun.loose_object_header(type, size)
:return: bytes representing the loose object header, which is immediately
followed by the content stream of size 'size'
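A sketch of the header format assumed here, the conventional git loose object header "<type> <size>\0" (make_loose_header is an illustrative name, not gitdb's API):

    def make_loose_header(type_name, size):
        # "<type> <size>\0", immediately followed by the content stream.
        return ("%s %d\0" % (type_name, size)).encode("ascii")

    assert make_loose_header("blob", 11) == b"blob 11\x00"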
def gitdb.fun.loose_object_header_info(m)
:return: tuple(type_string, uncompressed_size_in_bytes) the type string of the
object as well as its uncompressed size in bytes.
:param m: memory map from which to read the compressed object data
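A standalone sketch of how such a header can be read back from compressed data using zlib's decompressobj (an illustration, not gitdb's implementation):

    import zlib

    def parse_loose_header(compressed):
        # Decompress just enough bytes to expose "<type> <size>\0" and split it.
        hdr = zlib.decompressobj().decompress(compressed, 64)
        end = hdr.index(b"\0")
        type_name, size = hdr[:end].split(b" ")
        return type_name, int(size)

    data = zlib.compress(b"blob 11\x00hello world")
    assert parse_loose_header(data) == (b"blob", 11)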
def gitdb.fun.msb_size(data, offset=0)
:return: tuple(read_bytes, size) read the msb size from the given random
access data starting at the given byte offset
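A worked byte example of the size decoding assumed here (LSB-first 7-bit groups with a continuation bit, the usual git convention; the exact return layout of msb_size itself is not shown):

    data = bytes([0x90, 0x03])                    # 400 = 0b110010000
    size = (data[0] & 0x7F) | ((data[1] & 0x7F) << 7)
    assert size == 400                            # 16 + 3 * 128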
def gitdb.fun.pack_object_header_info(data)
:return: tuple(type_id, uncompressed_size_in_bytes, byte_offset)
The type_id should be interpreted according to the ``type_id_to_type_map`` map
The byte-offset specifies the start of the actual zlib compressed datastream
:param data: random-access memory, like a string or memory map
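Decoding the two-byte header produced in the create_pack_object_header sketch above, under the same assumed layout:

    c0, c1 = 0xB4, 0x06
    type_id = (c0 >> 4) & 0x07             # 3
    size = (c0 & 0x0F) | ((c1 & 0x7F) << 4)
    byte_offset = 2                         # zlib stream starts right after the header
    assert (type_id, size, byte_offset) == (3, 100, 2)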
def gitdb.fun.stream_copy(read, write, size, chunk_size)
Copy a stream up to size bytes using the provided read and write methods, in chunks of chunk_size.
**Note:** it's much like the stream_copy utility, but operates just using methods
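A minimal standalone sketch of such a method-based copy loop (copy_stream is an illustrative name, not gitdb's API):

    import io

    def copy_stream(read, write, size, chunk_size=65536):
        # Copy up to `size` bytes using only the given read/write callables.
        remaining = size
        while remaining:
            chunk = read(min(chunk_size, remaining))
            if not chunk:
                break                      # source exhausted early
            write(chunk)
            remaining -= len(chunk)
        return size - remaining            # bytes actually copied

    src, dst = io.BytesIO(b"x" * 1000), io.BytesIO()
    assert copy_stream(src.read, dst.write, 600, chunk_size=256) == 600
    assert dst.getvalue() == b"x" * 600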
def gitdb.fun.write_object(type, size, read, write, chunk_size=chunk_size)
Write the object as identified by type, size and source_stream into the
target_stream
:param type: type string of the object
:param size: amount of bytes to write from source_stream
:param read: read method of a stream providing the content data
:param write: write method of the output stream
:param close_target_stream: if True, the target stream will be closed when
the routine exits, even if an error is thrown
:return: The actual amount of bytes written to stream, which includes the header and a trailing newline
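A hypothetical call, assuming gitdb is installed and that the function writes a header followed by `size` bytes copied from `read` to `write`; whether the type is passed as bytes or str may depend on the gitdb version.

    import io
    from gitdb.fun import write_object

    content = b"hello world"
    src, dst = io.BytesIO(content), io.BytesIO()
    written = write_object(b"blob", len(content), src.read, dst.write)
    print(written, dst.getvalue()[:8])     # header bytes precede the content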