mirror of https://gitlab.com/jaywink/federation
Add audio and video media objects and much more.
parent 662e2964b6
commit 58c8f95e54
@@ -1,7 +1,7 @@
 # This file is a template, and might need editing before it works on your project.
 # Official language image. Look for the different tagged releases at:
 # https://hub.docker.com/r/library/python/tags/
-image: python:3.8
+image: python:3.10
 
 # Change pip's cache directory to be inside the project directory since we can
 # only cache local items.
CHANGELOG.md (+25)
@@ -1,5 +1,30 @@
 # Changelog
 
+## [0.23.0] - unreleased
+
+### Added
+
+* Inbound ActivityPub payloads are now processed by calamus (https://github.com/SwissDataScienceCenter/calamus),
+  which is a JSON-LD processor based on marshmallow.
+
+* For performance, requests_cache has been added. It pulls a Redis configuration from Django if one exists, or
+  falls back to a SQLite backend.
+
+* GET requests are now signed if the Django configuration includes FEDERATION_USER, which is used to fetch that
+  user's private key.
+
+* Added Video and Audio objects. Inbound support only.
+
+* Process ActivityPub reply collections.
+
+### Fixed
+
+* Signatures are now verified, and the corresponding payload is dropped if no public key is found.
+
+### Internal changes
+
+* Dropped Python 3.6 support.
+
 ## [0.22.0] - 2021-08-15
 
 ### Added
@@ -9,7 +9,7 @@ Help is more than welcome to extend this library. Please see the following resou
 Environment setup
 -----------------
 
-Once you have your (Python 3.6+) virtualenv set up, install the development requirements::
+Once you have your (Python 3.7+) virtualenv set up, install the development requirements::
 
     pip install -r dev-requirements.txt
 
@@ -14,9 +14,8 @@ Status
 Currently three protocols are being focused on.
 
 * Diaspora is considered to be stable with most of the protocol implemented.
-* ActivityPub support should be considered as alpha - all the basic
-  things work but there are likely to be a lot of compatibility issues with other ActivityPub
-  implementations.
+* ActivityPub support should be considered as beta - inbound payloads are
+  handled by a JSON-LD processor (calamus).
 * Matrix support cannot be considered usable as of yet.
 
 The code base is well tested and in use in several projects. Backward incompatible changes
@@ -48,9 +48,15 @@ Features currently supported:
 * Actor (Person outbound, Person, Organization, Service inbound)
 * Note, Article and Page (Create, Delete, Update)
   * These become a ``Post`` or ``Comment`` depending on ``inReplyTo``.
-* Attachment images from the above objects
+* Attachment images (inbound only for audio and video) from the above objects
 * Follow, Accept Follow, Undo Follow
 * Announce
+* Inbound Peertube Video objects translated as ``Post``.
+
+* Inbound processing of reply collections, for platforms that implement it.
 * Link, Like, View, Signature, PropertyValue, IdentityProof and Emojis objects are only processed for inbound
   payloads currently. Outbound processing requires support by the client
   application.
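The Note mapping described above (``inReplyTo`` decides between ``Post`` and ``Comment``) can be sketched with a hypothetical helper; this is illustrative logic, not the library's actual code:

```python
# Hypothetical helper mirroring the rule above: a Note with ``inReplyTo``
# becomes a Comment, otherwise a Post.
def classify(note: dict) -> str:
    return "Comment" if note.get("inReplyTo") else "Post"

classify({"type": "Note", "content": "hi"})                    # "Post"
classify({"type": "Note", "inReplyTo": "https://ex.tld/n/1"})  # "Comment"
```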
 Namespace
 .........
 
@@ -71,23 +77,26 @@ The following keys will be set on the entity based on the ``source`` property ex
 * ``_rendered_content`` will be the object ``content``
 * ``raw_content`` will be the object ``content`` run through an HTML2Markdown renderer
 
 The ``contentMap`` property is processed, but content language selection is not implemented yet.
 
 For outbound entities, ``raw_content`` is expected to be in ``text/markdown``,
 specifically CommonMark. When sending payloads, ``raw_content`` will be rendered via
 the ``commonmark`` library into ``object.content``. The original ``raw_content``
 will be added to the ``object.source`` property.
 
-Images
+Medias
 ......
 
 Any images referenced in the ``raw_content`` of outbound entities will be extracted
-into ``object.attachment`` objects, for receivers that don't support inline images.
-These attachments will have a ``pyfed:inlineImage`` property set to ``true`` to
-indicate the image has been extrated from the content. Receivers should ignore the
+into ``object.attachment`` objects. For receivers that don't support inline images,
+image attachments will have a ``pyfed:inlineImage`` property set to ``true`` to
+indicate the image has been extracted from the content. Receivers should ignore the
 inline image attachments if they support showing ``<img>`` HTML tags or the markdown
-content in ``object.source``.
+content in ``object.source``. Outbound audio and video attachments currently lack
+support from client applications.
 
-For inbound entities we do this automatically by not including received attachments in
-the entity ``_children`` attribute.
+For inbound entities we do this automatically by not including received image attachments in
+the entity ``_children`` attribute. Audio and video are passed on to the client application.
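A hypothetical attachment carrying the ``pyfed:inlineImage`` marker described above might look like this (URL and media type are made up for illustration):

```python
# Hypothetical shape of an extracted inline-image attachment.
attachment = {
    "type": "Image",
    "url": "https://example.com/media/photo.jpg",
    "mediaType": "image/jpeg",
    "pyfed:inlineImage": True,
}

# A receiver that renders <img> tags from object.content should skip it:
skip = attachment.get("pyfed:inlineImage", False)
```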
 
 .. _matrix:
 
@@ -37,7 +37,7 @@ passed back to the caller.
 For sending messages out, either base or protocol specific entities can be passed
 to the outbound senders.
 
-If you need the correct protocol speficic entity class from the base entity,
+If you need the correct protocol specific entity class from the base entity,
 each protocol will define a ``get_outbound_entity`` function.
 
 .. autofunction:: federation.entities.activitypub.mappers.get_outbound_entity
@@ -212,6 +212,7 @@ Some settings need to be set in Django settings. An example is below:
 
     FEDERATION = {
         "base_url": "https://myserver.domain.tld",
+        "federation_id": "https://example.com/u/john/",
         "get_object_function": "myproject.utils.get_object",
         "get_private_key_function": "myproject.utils.get_private_key",
         "get_profile_function": "myproject.utils.get_profile",
@@ -223,6 +224,7 @@ Some settings need to be set in Django settings. An example is below:
     }
 
 * ``base_url`` is the base URL of the server, ie protocol://domain.tld.
+* ``federation_id`` is a valid ActivityPub local profile id whose private key will be used to create the HTTP signature for GET requests to ActivityPub platforms.
 * ``get_object_function`` should be the full path to a function that will return the object matching the ActivityPub ID for the request object passed to this function.
 * ``get_private_key_function`` should be the full path to a function that will accept a federation ID (url, handle or guid) and return the private key of the user (as an RSA object). Required for example to sign outbound messages in some cases.
 * ``get_profile_function`` should be the full path to a function that should return a ``Profile`` entity. The function should take one or more keyword arguments: ``fid``, ``handle``, ``guid`` or ``request``. It should look up a profile with one or more of the provided parameters.
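A minimal sketch of the shapes these configured functions might take. The ``USERS`` store, the key value, and the lookup logic are illustrative assumptions, not library code; a real ``get_private_key`` should return an RSA key object:

```python
# Illustrative stand-ins for myproject.utils.get_private_key / get_profile.
USERS = {
    "https://example.com/u/john/": {
        "handle": "john@example.com",
        "private_key": "<RSA key object>",  # placeholder, not a real key
    },
}

def get_private_key(identifier):
    # Accept a federation ID and return the user's private key, or None.
    user = USERS.get(identifier)
    return user["private_key"] if user else None

def get_profile(fid=None, handle=None, guid=None, request=None):
    # Look up a profile by any of the provided identifiers.
    for key, user in USERS.items():
        if key == fid or user["handle"] == handle:
            return user
    return None
```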
@@ -3,7 +3,7 @@ CONTEXT_DIASPORA = {"diaspora": "https://diasporafoundation.org/ns/"}
 CONTEXT_HASHTAG = {"Hashtag": "as:Hashtag"}
 CONTEXT_LD_SIGNATURES = "https://w3id.org/security/v1"
 CONTEXT_MANUALLY_APPROVES_FOLLOWERS = {"manuallyApprovesFollowers": "as:manuallyApprovesFollowers"}
-CONTEXT_PYTHON_FEDERATION = {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation"}
+CONTEXT_PYTHON_FEDERATION = {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation#"}
 CONTEXT_SENSITIVE = {"sensitive": "as:sensitive"}
 
 CONTEXTS_DEFAULT = [
@@ -8,7 +8,7 @@ from federation.entities.activitypub.constants import (
     CONTEXTS_DEFAULT, CONTEXT_MANUALLY_APPROVES_FOLLOWERS, CONTEXT_SENSITIVE, CONTEXT_HASHTAG,
     CONTEXT_LD_SIGNATURES, CONTEXT_DIASPORA)
 from federation.entities.activitypub.enums import ActorType, ObjectType, ActivityType
-from federation.entities.base import Profile, Post, Follow, Accept, Comment, Retraction, Share, Image
+from federation.entities.base import Profile, Post, Follow, Accept, Comment, Retraction, Share, Image, Audio, Video
 from federation.entities.mixins import RawContentMixin, BaseEntity, PublicMixin, CreatedAtMixin
 from federation.entities.utils import get_base_attributes
 from federation.outbound import handle_send
@@ -122,13 +122,13 @@ class ActivitypubNoteMixin(AttachImagesMixin, CleanContentMixin, PublicMixin, Cr
         Extract mentions from the source object.
         """
         super().extract_mentions()
-        if not isinstance(self._source_object, dict):
-            return
-        source = self._source_object.get('object') if isinstance(self._source_object.get('object'), dict) else \
-            self._source_object
-        for tag in source.get('tag', []):
-            if tag.get('type') == "Mention" and tag.get('href'):
-                self._mentions.add(tag.get('href'))
+
+        if getattr(self, 'tag_list', None):
+            from federation.entities.activitypub.models import Mention  # Circulars
+            tag_list = self.tag_list if isinstance(self.tag_list, list) else [self.tag_list]
+            for tag in tag_list:
+                if isinstance(tag, Mention):
+                    self._mentions.add(tag.href)
 
     def pre_send(self):
         super().pre_send()
@@ -196,6 +196,8 @@ class ActivitypubNoteMixin(AttachImagesMixin, CleanContentMixin, PublicMixin, Cr
 
 
 class ActivitypubComment(ActivitypubNoteMixin, Comment):
     entity_type = "Comment"
 
     def to_as2(self) -> Dict:
         as2 = super().to_as2()
         as2["object"]["inReplyTo"] = self.target_id
@@ -210,17 +212,18 @@ class ActivitypubFollow(ActivitypubEntityMixin, Follow):
         Post receive hook - send back follow ack.
         """
         super().post_receive()
 
         if not self.following:
             return
 
         from federation.utils.activitypub import retrieve_and_parse_profile  # Circulars
         try:
             from federation.utils.django import get_function_from_config
-        except ImportError:
+            get_private_key_function = get_function_from_config("get_private_key_function")
+        except (ImportError, AttributeError):
             logger.warning("ActivitypubFollow.post_receive - Unable to send automatic Accept back, only supported on "
                            "Django currently")
             return
-        get_private_key_function = get_function_from_config("get_private_key_function")
         key = get_private_key_function(self.target_id)
         if not key:
             logger.warning("ActivitypubFollow.post_receive - Failed to send automatic Accept back: could not find "
@@ -292,6 +295,11 @@ class ActivitypubImage(ActivitypubEntityMixin, Image):
             "pyfed:inlineImage": self.inline,
         }
 
+class ActivitypubAudio(ActivitypubEntityMixin, Audio):
+    pass
+
+class ActivitypubVideo(ActivitypubEntityMixin, Video):
+    pass
 
 class ActivitypubPost(ActivitypubNoteMixin, Post):
     pass
@@ -301,6 +309,9 @@ class ActivitypubProfile(ActivitypubEntityMixin, Profile):
     _type = ActorType.PERSON.value
     public = True
 
+    def __init__(self, *args, **kwargs):
+        super().__init__(*args, **kwargs)
+
     def to_as2(self) -> Dict:
         as2 = {
             "@context": CONTEXTS_DEFAULT + [
@@ -5,6 +5,7 @@ from federation.entities.activitypub.constants import NAMESPACE_PUBLIC
 from federation.entities.activitypub.entities import (
     ActivitypubFollow, ActivitypubProfile, ActivitypubAccept, ActivitypubPost, ActivitypubComment,
     ActivitypubRetraction, ActivitypubShare, ActivitypubImage)
+from federation.entities.activitypub.models import element_to_objects
 from federation.entities.base import Follow, Profile, Accept, Post, Comment, Retraction, Share, Image
 from federation.entities.mixins import BaseEntity
 from federation.types import UserType, ReceiverVariant
@@ -46,12 +47,13 @@ UNDO_MAPPINGS = {
 }
 
 
-def element_to_objects(payload: Dict) -> List:
+def element_to_objects_orig(payload: Dict) -> List:
     """
     Transform an Element to a list of entities.
     """
     cls = None
     entities = []
 
     is_object = True if payload.get('type') in OBJECTS else False
     if payload.get('type') == "Delete":
         cls = ActivitypubRetraction
@@ -70,12 +72,6 @@ def element_to_objects(payload: Dict) -> List:
 
     transformed = transform_attributes(payload, cls, is_object=is_object)
     entity = cls(**transformed)
-    # Add protocol name
-    entity._source_protocol = "activitypub"
-    # Save element object to entity for possible later use
-    entity._source_object = payload
-    # Extract receivers
-    entity._receivers = extract_receivers(payload)
     # Extract children
     if payload.get("object") and isinstance(payload.get("object"), dict):
         # Try object if exists
@@ -84,20 +80,6 @@ def element_to_objects(payload: Dict) -> List:
         # Try payload itself
         entity._children = extract_attachments(payload)
 
-    if hasattr(entity, "post_receive"):
-        entity.post_receive()
-
-    try:
-        entity.validate()
-    except ValueError as ex:
-        logger.error("Failed to validate entity %s: %s", entity, ex, extra={
-            "transformed": transformed,
-        })
-        return []
-    # Extract mentions
-    if hasattr(entity, "extract_mentions"):
-        entity.extract_mentions()
-
     entities.append(entity)
 
     return entities
@@ -126,50 +108,6 @@ def extract_attachments(payload: Dict) -> List[Image]:
     return attachments
 
 
-def extract_receiver(payload: Dict, receiver: str) -> Optional[UserType]:
-    """
-    Transform a single receiver ID to a UserType.
-    """
-    actor = payload.get("actor") or payload.get("attributedTo") or ""
-    if receiver == NAMESPACE_PUBLIC:
-        # Ignore since we already store "public" as a boolean on the entity
-        return
-    # Check for this being a list reference to followers of an actor?
-    # TODO: terrible hack! the way some platforms deliver to sharedInbox using just
-    # the followers collection as a target is annoying to us since we would have to
-    # store the followers collection references on application side, which we don't
-    # want to do since it would make application development another step more complex.
-    # So for now we're going to do a terrible assumption that
-    # 1) if "followers" in ID and
-    # 2) if ID starts with actor ID
-    # then; assume this is the followers collection of said actor ID.
-    # When we have a caching system, just fetch each receiver and check what it is.
-    # Without caching this would be too expensive to do.
-    elif receiver.find("followers") > -1 and receiver.startswith(actor):
-        return UserType(id=actor, receiver_variant=ReceiverVariant.FOLLOWERS)
-    # Assume actor ID
-    return UserType(id=receiver, receiver_variant=ReceiverVariant.ACTOR)
-
-
-def extract_receivers(payload: Dict) -> List[UserType]:
-    """
-    Extract receivers from a payload.
-    """
-    receivers = []
-    for key in ("to", "cc"):
-        receiver = payload.get(key)
-        if isinstance(receiver, list):
-            for item in receiver:
-                extracted = extract_receiver(payload, item)
-                if extracted:
-                    receivers.append(extracted)
-        elif isinstance(receiver, str):
-            extracted = extract_receiver(payload, receiver)
-            if extracted:
-                receivers.append(extracted)
-    return receivers
 
 
 def get_outbound_entity(entity: BaseEntity, private_key):
     """Get the correct outbound entity for this protocol.
@@ -0,0 +1,997 @@
from copy import copy
import json
import logging
from typing import List, Callable, Dict, Union, Optional

from calamus import fields
from calamus.schema import JsonLDAnnotation, JsonLDSchema, JsonLDSchemaOpts
from calamus.utils import normalize_value
from marshmallow import exceptions, pre_load, post_load, pre_dump, post_dump
from marshmallow.fields import Integer
from marshmallow.utils import EXCLUDE
from pyld import jsonld
import requests_cache as rc

from federation.entities.activitypub.constants import NAMESPACE_PUBLIC
from federation.entities.activitypub.entities import (
    ActivitypubAccept, ActivitypubPost, ActivitypubComment, ActivitypubProfile,
    ActivitypubImage, ActivitypubAudio, ActivitypubVideo, ActivitypubFollow,
    ActivitypubShare, ActivitypubRetraction)
from federation.entities.mixins import BaseEntity
from federation.types import UserType, ReceiverVariant
from federation.utils.activitypub import retrieve_and_parse_document
from federation.utils.text import with_slash, validate_handle

logger = logging.getLogger("federation")


# This is required to work around a bug in pyld that has the Accept header
# accept other content types. From what I understand, precedence handling
# is broken.
# From https://github.com/digitalbazaar/pyld/issues/133
def get_loader(*args, **kwargs):
    # try to obtain a Redis config from Django
    try:
        from federation.utils.django import get_configuration
        cfg = get_configuration()
        if cfg.get('redis'):
            backend = rc.RedisCache(namespace='fed_cache', **cfg['redis'])
        else:
            backend = rc.SQLiteCache(db_path='fed_cache')
    except ImportError:
        backend = rc.SQLiteCache(db_path='fed_cache')
    logger.debug('Using %s for requests_cache', type(backend))

    requests_loader = jsonld.requests_document_loader(*args, **kwargs)

    def loader(url, options={}):
        options['headers']['Accept'] = 'application/ld+json'
        with rc.enabled(cache_name='fed_cache', backend=backend):
            return requests_loader(url, options)

    return loader

jsonld.set_document_loader(get_loader())
class AddedSchemaOpts(JsonLDSchemaOpts):
    def __init__(self, meta, *args, **kwargs):
        super().__init__(meta, *args, **kwargs)
        self.inherit_parent_types = False
        self.unknown = EXCLUDE

JsonLDSchema.OPTIONS_CLASS = AddedSchemaOpts


# Not sure how exhaustive this needs to be...
as2 = fields.Namespace("https://www.w3.org/ns/activitystreams#")
dc = fields.Namespace("http://purl.org/dc/terms/")
diaspora = fields.Namespace("https://diasporafoundation.org/ns/")
ldp = fields.Namespace("http://www.w3.org/ns/ldp#")
litepub = fields.Namespace("http://litepub.social/ns#")
misskey = fields.Namespace("https://misskey-hub.net/ns#")
ostatus = fields.Namespace("http://ostatus.org#")
pt = fields.Namespace("https://joinpeertube.org/ns#")
pyfed = fields.Namespace("https://docs.jasonrobinson.me/ns/python-federation#")
schema = fields.Namespace("http://schema.org#")
sec = fields.Namespace("https://w3id.org/security#")
toot = fields.Namespace("http://joinmastodon.org/ns#")
vcard = fields.Namespace("http://www.w3.org/2006/vcard/ns#")
xsd = fields.Namespace("http://www.w3.org/2001/XMLSchema#")
zot = fields.Namespace("https://hubzilla.org/apschema#")
# Maybe this is food for an issue with calamus. pyld expands IRIs in an array,
# marshmallow then barfs with an invalid string value.
# Workaround: get rid of the array.
# Also, this implements the many attribute for IRI fields, sort of.
class IRI(fields.IRI):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.dump_derived = kwargs.get('dump_derived')

    def _serialize(self, value, attr, data, **kwargs):
        if not value and isinstance(self.dump_derived, dict):
            fields = {f: getattr(data, f) for f in self.dump_derived['fields']}
            value = self.dump_derived['fmt'].format(**fields)

        return super()._serialize(value, attr, data, **kwargs)

    def _deserialize(self, value, attr, data, **kwargs):
        if isinstance(value, list) and len(value) == 0: return value
        value = normalize_value(value)
        if isinstance(value, list):
            # no call to super() in list comprehensions...
            ret = []
            for val in value:
                v = super()._deserialize(val, attr, data, **kwargs)
                ret.append(v)
            return ret

        return super()._deserialize(value, attr, data, **kwargs)
# Don't want expanded IRIs to be exposed as dict keys
class Dict(fields.Dict):
    ctx = ["https://www.w3.org/ns/activitystreams", "https://w3id.org/security/v1"]

    # may or may not be needed
    def _serialize(self, value, attr, obj, **kwargs):
        if isinstance(value, dict):
            value['@context'] = self.ctx
            value = jsonld.expand(value)[0]
        return super()._serialize(value, attr, obj, **kwargs)

    def _deserialize(self, value, attr, data, **kwargs):
        # HACK: "promote" a Pleroma source field by adding content
        # and mediaType as2 properties
        if attr == str(as2.source):
            if isinstance(value, list) and str(as2.content) not in value[0].keys():
                value = [{str(as2.content): value, str(as2.mediaType): 'text/plain'}]
        ret = super()._deserialize(value, attr, data, **kwargs)
        ret = jsonld.compact(ret, self.ctx)
        ret.pop('@context')
        return ret
# calamus sets a XMLSchema#integer type, but different definitions
# may be used, hence the flavor property
# TODO: handle non negative types
class Integer(fields._JsonLDField, Integer):
    flavor = None  # add fields.IRIReference type hint

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.flavor = kwargs.get('flavor')

    def _serialize(self, value, attr, obj, **kwargs):
        value = super()._serialize(value, attr, obj, **kwargs)
        flavor = str(self.flavor) if self.flavor else "http://www.w3.org/2001/XMLSchema#integer"
        if self.parent.opts.add_value_types or self.add_value_types:
            value = {"@value": value, "@type": flavor}
        return value
# calamus doesn't implement JSON-LD language maps
class LanguageMap(Dict):
    def _serialize(self, value, attr, obj, **kwargs):
        ret = super()._serialize(value, attr, obj, **kwargs)
        if not ret: return ret
        value = []
        for k, v in ret.items():
            if k == 'orig':
                value.append({'@value': v})
            else:
                value.append({'@language': k, '@value': v})

        return value

    def _deserialize(self, value, attr, data, **kwargs):
        ret = {}
        for i, c in enumerate(value):
            lang = c.pop('@language', None)
            lang = '_:' + lang if lang else '_:orig'
            ret[lang] = [c]
        return super()._deserialize(ret, attr, data, **kwargs)
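The language-map shape handled above can be illustrated with a plain-Python version of the serialize mapping; the ``'orig'`` key convention comes from the class above, while the helper itself is hypothetical:

```python
# Plain-Python mirror of LanguageMap._serialize: a dict keyed by language
# tag becomes a JSON-LD language map; 'orig' marks the untagged original.
def to_language_map(values: dict) -> list:
    out = []
    for lang, text in values.items():
        if lang == 'orig':
            out.append({'@value': text})
        else:
            out.append({'@language': lang, '@value': text})
    return out

to_language_map({'orig': 'Hello', 'fr': 'Bonjour'})
# [{'@value': 'Hello'}, {'@language': 'fr', '@value': 'Bonjour'}]
```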
class MixedField(fields.Nested):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.iri = IRI(self.field_name, add_value_types=False)

    def _bind_to_schema(self, field_name, schema):
        super()._bind_to_schema(field_name, schema)
        self.iri.parent = self.parent

    def _serialize(self, value, attr, obj, **kwargs):
        if isinstance(value, str) or (
                isinstance(value, list) and len(value) > 0 and isinstance(value[0], str)):
            return self.iri._serialize(value, attr, obj, **kwargs)
        else:
            return super()._serialize(value, attr, obj, **kwargs)

    def _deserialize(self, value, attr, data, **kwargs):
        # this is just so the ACTIVITYPUB_POST_OBJECT_IMAGES test payload passes
        if len(value) == 0: return value

        if isinstance(value, list) and value[0] == {}: return {}

        ret = []
        for item in value:
            if item.get('@type'):
                res = super()._deserialize(item, attr, data, **kwargs)
                ret.append(res)
            else:
                ret.append(self.iri._deserialize(item, attr, data, **kwargs))

        return ret if len(ret) > 1 else ret[0]
OBJECTS = [
    'AnnounceSchema',
    'ApplicationSchema',
    'ArticleSchema',
    'FollowSchema',
    'GroupSchema',
    'LikeSchema',
    'NoteSchema',
    'OrganizationSchema',
    'PageSchema',
    'PersonSchema',
    'ServiceSchema',
    'TombstoneSchema',
    'VideoSchema',
]


def set_public(entity):
    for attr in [getattr(entity, 'to', []), getattr(entity, 'cc', [])]:
        if isinstance(attr, list):
            if NAMESPACE_PUBLIC in attr: entity.public = True
        elif attr == NAMESPACE_PUBLIC: entity.public = True
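A self-contained usage sketch of ``set_public`` above; the entity is stubbed with ``SimpleNamespace`` and ``NAMESPACE_PUBLIC`` is repeated here so the snippet runs standalone:

```python
from types import SimpleNamespace

NAMESPACE_PUBLIC = "https://www.w3.org/ns/activitystreams#Public"

def set_public(entity):
    # same logic as above: public if the AS2 Public collection is addressed
    for attr in [getattr(entity, 'to', []), getattr(entity, 'cc', [])]:
        if isinstance(attr, list):
            if NAMESPACE_PUBLIC in attr: entity.public = True
        elif attr == NAMESPACE_PUBLIC: entity.public = True

post = SimpleNamespace(to=[NAMESPACE_PUBLIC], cc=[], public=False)
set_public(post)
# post.public is now True
```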
class Object(metaclass=JsonLDAnnotation):
    atom_url = fields.String(ostatus.atomUri)
    also_known_as = IRI(as2.alsoKnownAs)
    icon = MixedField(as2.icon, nested='ImageSchema')
    image = MixedField(as2.image, nested='ImageSchema')
    tag_list = MixedField(as2.tag, nested=['HashtagSchema', 'MentionSchema', 'PropertyValueSchema', 'EmojiSchema'])
    _children = fields.Nested(as2.attachment, nested=['ImageSchema', 'AudioSchema', 'DocumentSchema', 'PropertyValueSchema', 'IdentityProofSchema'], many=True)
    content_map = LanguageMap(as2.content)  # language maps are not implemented in calamus
    context = IRI(as2.context)
    guid = fields.String(diaspora.guid)
    name = fields.String(as2.name)
    generator = MixedField(as2.generator, nested='ServiceSchema')
    created_at = fields.DateTime(as2.published, add_value_types=True)
    replies = MixedField(as2.replies, nested=['CollectionSchema', 'OrderedCollectionSchema'])
    signature = MixedField(sec.signature, nested='SignatureSchema')
    start_time = fields.DateTime(as2.startTime, add_value_types=True)
    updated = fields.DateTime(as2.updated, add_value_types=True)
    to = IRI(as2.to)
    cc = IRI(as2.cc)
    media_type = fields.String(as2.mediaType)
    sensitive = fields.Boolean(as2.sensitive)
    source = Dict(as2.source)

    # The following properties are defined by some platforms, but are not implemented yet:
    # audience, endTime, location, preview, bto, bcc, duration

    def __init__(self, *args, **kwargs):
        for k, v in kwargs.items():
            if hasattr(self, k):
                setattr(self, k, v)
        self.has_schema = True

    # noop to avoid isinstance tests
    def to_base(self):
        return self

    class Meta:
        rdf_type = as2.Object

    @pre_load
    def update_context(self, data, **kwargs):
        if not data.get('@context'): return data
        ctx = copy(data['@context'])

        # add a # at the end of the python-federation string
        # for socialhome payloads
        s = json.dumps(ctx)
        if 'python-federation"' in s:
            ctx = json.loads(s.replace('python-federation', 'python-federation#', 1))

        # gotosocial has http://joinmastodon.org/ns in @context. This
        # is not a json-ld document.
        try:
            ctx.pop(ctx.index('http://joinmastodon.org/ns'))
        except (ValueError, AttributeError):
            pass

        # remove @language in context since this directive is not
        # processed by calamus. Pleroma adds a useless @language: 'und'
        # which is discouraged in best practices and in some cases makes
        # calamus return a dict where a str is expected.
        # see https://www.rfc-editor.org/rfc/rfc5646, page 56
        idx = []
        for i, v in enumerate(ctx):
            if isinstance(v, dict):
                v.pop('@language', None)
                if len(v) == 0: idx.insert(0, i)
        for i in idx: ctx.pop(i)

        # AP activities may be signed, but most platforms don't
        # define RsaSignature2017. Add it to the context.
        # hubzilla doesn't define the discoverable property in its context
        may_add = {
            'signature': ['https://w3id.org/security/v1', {'sec': 'https://w3id.org/security#', 'RsaSignature2017': 'sec:RsaSignature2017'}],
            'discoverable': [{'toot': 'http://joinmastodon.org/ns#', 'discoverable': 'toot:discoverable'}],  # for hubzilla
            'copiedTo': [{'toot': 'http://joinmastodon.org/ns#', 'copiedTo': 'toot:copiedTo'}],  # for hubzilla
            'featured': [{'toot': 'http://joinmastodon.org/ns#', 'featured': 'toot:featured'}],  # for litepub and pleroma
            'tag': [{'Hashtag': 'as:Hashtag'}],  # for epicyon
        }

        to_add = [val for key, val in may_add.items() if data.get(key)]
        if to_add:
            idx = [i for i, v in enumerate(ctx) if isinstance(v, dict)]
            if idx:
                upd = ctx[idx[0]]
                # merge context dicts
                if len(idx) > 1:
                    idx.reverse()
                    for i in idx[:-1]:
                        upd.update(ctx[i])
                        ctx.pop(i)
            else:
                upd = {}

            for add in to_add:
                for val in add:
                    if isinstance(val, str) and val not in ctx:
                        try:
                            ctx.append(val)
                        except AttributeError:
                            ctx = [ctx, val]
                    if isinstance(val, dict):
                        upd.update(val)
            if not idx and upd: ctx.append(upd)

        data['@context'] = ctx
        return data

    # A node without an id isn't true json-ld, but many payloads have
    # id-less nodes. Since calamus forces random ids on such nodes,
    # this removes them.
    @post_dump
    def noid(self, data, **kwargs):
        if data['@id'].startswith('_:'): data.pop('@id')
        return data
class Home(metaclass=JsonLDAnnotation):
|
||||
country_name = fields.String(fields.IRIReference("http://www.w3.org/2006/vcard/ns#","country-name"))
|
||||
region = fields.String(vcard.region)
|
||||
locality = fields.String(vcard.locality)
|
||||
|
||||
class Meta:
|
||||
rdf_type = vcard.Home
|
||||
|
||||
|
||||
class List(fields.List):
|
||||
def _deserialize(self,value, attr, data, **kwargs):
|
||||
value = normalize_value(value)
|
||||
return super()._deserialize(value,attr,data,**kwargs)


class Collection(Object):
    id = fields.Id()
    items = MixedField(as2.items, nested=OBJECTS)
    first = MixedField(as2.first, nested=['CollectionPageSchema', 'OrderedCollectionPageSchema'])
    current = IRI(as2.current)
    last = IRI(as2.last)
    total_items = Integer(as2.totalItems, flavor=xsd.nonNegativeInteger, add_value_types=True)

    class Meta:
        rdf_type = as2.Collection


class OrderedCollection(Collection):
    items = List(as2.items, cls_or_instance=MixedField(as2.items, nested=OBJECTS))

    class Meta:
        rdf_type = as2.OrderedCollection


class CollectionPage(Collection):
    part_of = IRI(as2.partOf)
    next_ = IRI(as2.next)
    prev = IRI(as2.prev)

    class Meta:
        rdf_type = as2.CollectionPage


class OrderedCollectionPage(OrderedCollection, CollectionPage):
    start_index = Integer(as2.startIndex, flavor=xsd.nonNegativeInteger, add_value_types=True)

    class Meta:
        rdf_type = as2.OrderedCollectionPage
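The page classes above link together through `first`, `next_` and `prev`. As a rough illustration of how a consumer walks such a paged collection, here is a minimal, self-contained sketch that operates on plain dicts standing in for fetched pages (the page store and item keys are illustrative, not the calamus models):

```python
# Walk an ordered collection by following `next` page references.
# `pages` stands in for fetched documents, keyed by page id.
def collect_items(page: dict, pages: dict) -> list:
    items = []
    while page:
        items.extend(page.get("orderedItems", []))
        page = pages.get(page.get("next"))  # None ends the walk
    return items

pages = {
    "p1": {"orderedItems": [1, 2], "next": "p2"},
    "p2": {"orderedItems": [3], "next": None},
}
assert collect_items(pages["p1"], pages) == [1, 2, 3]
```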


# This mimics the fact that federation currently handles an AP Document as an AP Image.
# AP defines both [Ii]mage and [Aa]udio objects/properties, but only a Video object,
# which has so far only been seen in Peertube payloads.
class Document(Object):
    inline = fields.Boolean(pyfed.inlineImage)
    height = Integer(as2.height, flavor=xsd.nonNegativeInteger, add_value_types=True)
    width = Integer(as2.width, flavor=xsd.nonNegativeInteger, add_value_types=True)
    blurhash = fields.String(toot.blurhash)
    url = MixedField(as2.url, nested='LinkSchema')

    def to_base(self):
        if self.media_type.startswith('image'):
            return ActivitypubImage(**self.__dict__)
        if self.media_type.startswith('audio'):
            return ActivitypubAudio(**self.__dict__)
        if self.media_type.startswith('video'):
            return ActivitypubVideo(**self.__dict__)
        return self  # unknown media type: pass the Document through unchanged

    class Meta:
        rdf_type = as2.Document
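The `to_base` method above dispatches on the `mediaType` prefix to pick the concrete entity class. A standalone sketch of that dispatch logic (class names returned as strings here, purely for illustration):

```python
# Media-type dispatch as done in Document.to_base(): the concrete class
# is chosen from the prefix of the MIME type; unknown types fall through.
def dispatch(media_type: str) -> str:
    for prefix, kind in (("image", "Image"), ("audio", "Audio"), ("video", "Video")):
        if media_type.startswith(prefix):
            return kind
    return "Document"  # unknown media type passes through unchanged

assert dispatch("image/png") == "Image"
assert dispatch("audio/ogg") == "Audio"
assert dispatch("application/pdf") == "Document"
```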


class Image(Document):
    @classmethod
    def from_base(cls, entity):
        return cls(**entity.__dict__)

    class Meta:
        rdf_type = as2.Image


# No Audio objects have been seen in the wild so far.
class Audio(Document):
    @classmethod
    def from_base(cls, entity):
        return cls(**entity.__dict__)

    class Meta:
        rdf_type = as2.Audio


class Infohash(Object):
    name = fields.String(as2.name)

    class Meta:
        rdf_type = pt.Infohash


class Link(metaclass=JsonLDAnnotation):
    href = IRI(as2.href)
    rel = fields.List(as2.rel, cls_or_instance=fields.String(as2.rel))
    media_type = fields.String(as2.mediaType)
    name = fields.String(as2.name)
    href_lang = fields.String(as2.hrefLang)
    height = Integer(as2.height, flavor=xsd.nonNegativeInteger, add_value_types=True)
    width = Integer(as2.width, flavor=xsd.nonNegativeInteger, add_value_types=True)
    fps = Integer(pt.fps, flavor=schema.Number, add_value_types=True)
    size = Integer(pt.size, flavor=schema.Number, add_value_types=True)
    tag = MixedField(as2.tag, nested=['InfohashSchema', 'LinkSchema'])
    # Not implemented yet
    # preview: variable type?

    class Meta:
        rdf_type = as2.Link

    @post_load
    def make_instance(self, data, **kwargs):
        data.pop('@id', None)
        return super().make_instance(data, **kwargs)


class Hashtag(Link):

    class Meta:
        rdf_type = as2.Hashtag


class Mention(Link):

    class Meta:
        rdf_type = as2.Mention


class PropertyValue(Object):
    name = fields.String(as2.name)
    value = fields.String(schema.value)

    class Meta:
        rdf_type = schema.PropertyValue


class IdentityProof(Object):
    signature_value = fields.String(sec.signatureValue)
    signing_algorithm = fields.String(sec.signingAlgorithm)

    class Meta:
        rdf_type = toot.IdentityProof


class Emoji(Object):

    class Meta:
        rdf_type = toot.Emoji


class Person(Object):
    id = fields.Id()
    inbox = IRI(ldp.inbox)
    outbox = IRI(as2.outbox, dump_derived={'fmt': '{id}outbox/', 'fields': ['id']})
    following = IRI(as2.following, dump_derived={'fmt': '{id}following/', 'fields': ['id']})
    followers = IRI(as2.followers, dump_derived={'fmt': '{id}followers/', 'fields': ['id']})
    username = fields.String(as2.preferredUsername)
    endpoints = Dict(as2.endpoints)
    shared_inbox = IRI(as2.sharedInbox)  # misskey adds this
    url = IRI(as2.url)
    playlists = IRI(pt.playlists)
    featured = IRI(toot.featured)
    featuredTags = IRI(toot.featuredTags)
    manuallyApprovesFollowers = fields.Boolean(as2.manuallyApprovesFollowers, default=False)
    discoverable = fields.Boolean(toot.discoverable)
    devices = IRI(toot.devices)
    public_key_dict = Dict(sec.publicKey)
    guid = fields.String(diaspora.guid)
    handle = fields.String(diaspora.handle)
    raw_content = fields.String(as2.summary)
    has_address = MixedField(vcard.hasAddress, nested='HomeSchema')
    has_instant_message = fields.List(vcard.hasInstantMessage, cls_or_instance=fields.String)
    address = fields.String(vcard.Address)
    is_cat = fields.Boolean(misskey.isCat)
    moved_to = IRI(as2.movedTo)
    copied_to = IRI(toot.copiedTo)
    capabilities = Dict(litepub.capabilities)
    suspended = fields.Boolean(toot.suspended)
    # Not implemented yet
    # liked is a collection
    # streams
    # proxyUrl
    # oauthAuthorizationEndpoint
    # oauthTokenEndpoint
    # provideClientKey
    # signClientKey

    @classmethod
    def from_base(cls, entity):
        ret = cls(**entity.__dict__)
        if not hasattr(entity, 'inboxes'): return ret

        ret.inbox = entity.inboxes["private"]
        ret.outbox = f"{with_slash(ret.id)}outbox/"
        ret.followers = f"{with_slash(ret.id)}followers/"
        ret.following = f"{with_slash(ret.id)}following/"
        ret.endpoints = {'sharedInbox': entity.inboxes["public"]}
        ret.public_key_dict = {
            "id": f"{ret.id}#main-key",
            "owner": ret.id,
            "publicKeyPem": entity.public_key
        }
        if entity.image_urls.get('large'):
            try:
                profile_icon = ActivitypubImage(url=entity.image_urls.get('large'))
                if profile_icon.media_type:
                    ret.icon = [Image.from_base(profile_icon)]
            except Exception as ex:
                logger.warning("ActivitypubProfile.to_as2 - failed to set profile icon: %s", ex)

        return ret

    def to_base(self):
        entity = ActivitypubProfile(**self.__dict__)
        entity.inboxes = {
            'private': getattr(self, 'inbox', None),
            'public': None
        }
        if hasattr(self, 'endpoints') and isinstance(self.endpoints, dict):
            entity.inboxes['public'] = self.endpoints.get('sharedInbox', None)
        else:
            entity.inboxes['public'] = getattr(self, 'shared_inbox', None)
        if hasattr(self, 'public_key_dict') and isinstance(self.public_key_dict, dict):
            entity.public_key = self.public_key_dict.get('publicKeyPem', None)
        if getattr(self, 'icon', None):
            icon = self.icon if not isinstance(self.icon, list) else self.icon[0]
            entity.image_urls = {
                'small': icon.url,
                'medium': icon.url,
                'large': icon.url
            }

        entity._allowed_children += (PropertyValue, IdentityProof)

        set_public(entity)
        return entity

    class Meta:
        rdf_type = as2.Person


class Group(Person):

    class Meta:
        rdf_type = as2.Group


class Application(Person):
    class Meta:
        rdf_type = as2.Application


class Organization(Person):
    class Meta:
        rdf_type = as2.Organization


class Service(Person):
    class Meta:
        rdf_type = as2.Service


# The to_base method is used when an AP object type maps to multiple
# classes depending on the existence or value of specific properties,
# when the same class is used both as an object and as an activity, or
# when a property can't be deserialized directly from the payload.
# The calamus Nested field can't handle using the same model
# or the same type in multiple schemas.
class Note(Object):
    id = fields.Id()
    actor_id = IRI(as2.attributedTo)
    target_id = IRI(as2.inReplyTo)
    conversation = fields.RawJsonLD(ostatus.conversation)
    in_reply_to_atom_uri = IRI(ostatus.inReplyToAtomUri)
    summary = fields.String(as2.summary)
    url = IRI(as2.url)

    def to_base(self):
        entity = ActivitypubComment(**self.__dict__) if getattr(self, 'target_id', None) else ActivitypubPost(**self.__dict__)

        if hasattr(self, 'content_map'):
            orig = self.content_map.pop('orig')
            if len(self.content_map.keys()) > 1:
                logger.warning('Language selection not implemented, falling back to default')
                entity._rendered_content = orig.strip()
            else:
                entity._rendered_content = orig.strip() if len(self.content_map.keys()) == 0 else next(iter(self.content_map.values())).strip()

        if getattr(self, 'source', None) and self.source.get('mediaType') == 'text/markdown':
            entity._media_type = self.source['mediaType']
            entity.raw_content = self.source.get('content').strip()
        else:
            entity._media_type = 'text/html'
            entity.raw_content = entity._rendered_content
        # Allow posts/replies that carry media only.
        if not entity.raw_content: entity.raw_content = "<div></div>"

        if isinstance(getattr(entity, '_children', None), list):
            children = []
            for child in entity._children:
                img = child.to_base()
                if img:
                    if isinstance(img, ActivitypubImage) and img.inline:
                        continue
                    children.append(img)
            entity._children = children

        entity._allowed_children += (ActivitypubAudio, ActivitypubVideo)

        set_public(entity)
        return entity

    class Meta:
        rdf_type = as2.Note


class Article(Note):
    class Meta:
        rdf_type = as2.Article


class Page(Note):
    class Meta:
        rdf_type = as2.Page


# Peertube uses a lot of properties differently...
class Video(Object):
    id = fields.Id()
    actor_id = MixedField(as2.attributedTo, nested=['PersonSchema', 'GroupSchema'])
    url = MixedField(as2.url, nested='LinkSchema')

    class Meta:
        unknown = EXCLUDE  # required until all the pt fields are defined
        rdf_type = as2.Video

    def to_base(self):
        """Turn a Peertube Video object into a Post.

        Currently assumes Video objects with a content_map
        come from Peertube, but that's a bit weak.
        """
        if hasattr(self, 'content_map'):
            text = self.content_map['orig']
            if getattr(self, 'media_type', None) == 'text/markdown':
                url = ""
                for u in self.url:
                    if getattr(u, 'media_type', None) == 'text/html':
                        url = u.href
                        break
                text = f'[{self.name}]({url})\n\n' + text
                self.raw_content = text.strip()
                self._media_type = self.media_type

            if hasattr(self, 'actor_id'):
                act = self.actor_id
                new_act = []
                if not isinstance(act, list): act = [act]
                for a in act:
                    if type(a) == Person:
                        new_act.append(a.id)
                # TODO: fix extract_receivers, which doesn't handle multiple actors!
                self.actor_id = new_act[0]

            entity = ActivitypubPost(**self.__dict__)
            set_public(entity)
            return entity
        # Some other Video object
        else:
            return ActivitypubVideo(**self.__dict__)


class Signature(Object):
    created = fields.DateTime(dc.created, add_value_types=True)
    creator = IRI(dc.creator)
    key = fields.String(sec.signatureValue)
    nonce = fields.String(sec.nonce)

    class Meta:
        rdf_type = sec.RsaSignature2017


class Activity(Object):
    actor_id = IRI(as2.actor)
    instrument = MixedField(as2.instrument, nested='ServiceSchema')
    # Not implemented yet
    # result
    # origin

    def __init__(self, *args, **kwargs):
        self.activity = self
        super().__init__(*args, **kwargs)

    class Meta:
        rdf_type = as2.Activity


class Follow(Activity):
    activity_id = fields.Id()
    target_id = IRI(as2.object)

    def to_base(self):
        entity = ActivitypubFollow(**self.__dict__)
        # This assumes a Follow can only be the object of an Undo activity. Lazy.
        if self.activity != self:
            entity.following = False

        return entity

    class Meta:
        rdf_type = as2.Follow


class Announce(Activity):
    id = fields.Id()
    target_id = IRI(as2.object)

    def to_base(self):
        if self.activity == self:
            entity = ActivitypubShare(**self.__dict__)
        else:
            self.target_id = self.id
            self.entity_type = 'Object'
            entity = ActivitypubRetraction(**self.__dict__)

        set_public(entity)
        return entity

    class Meta:
        rdf_type = as2.Announce


class Tombstone(Object):
    target_id = fields.Id()

    def to_base(self):
        if self.activity != self: self.actor_id = self.activity.actor_id
        self.entity_type = 'Object'
        return ActivitypubRetraction(**self.__dict__)

    class Meta:
        rdf_type = as2.Tombstone


class Create(Activity):
    activity_id = fields.Id()
    object_ = MixedField(as2.object, nested=OBJECTS)

    class Meta:
        rdf_type = as2.Create


class Like(Announce):
    like = fields.String(diaspora.like)

    def to_base(self):
        return self

    class Meta:
        rdf_type = as2.Like


# An inbound Accept is a noop...
class Accept(Create):
    def to_base(self):
        del self.object_
        return ActivitypubAccept(**self.__dict__)

    class Meta:
        rdf_type = as2.Accept


class Delete(Create):
    def to_base(self):
        if hasattr(self, 'object_') and not isinstance(self.object_, Tombstone):
            self.target_id = self.object_
            self.entity_type = 'Object'
        return ActivitypubRetraction(**self.__dict__)

    class Meta:
        rdf_type = as2.Delete


class Update(Create):
    class Meta:
        rdf_type = as2.Update


class Undo(Create):
    class Meta:
        rdf_type = as2.Undo


class View(Create):
    class Meta:
        rdf_type = as2.View


def process_followers(obj, base_url):
    pass


def extract_receiver(entity, receiver):
    """
    Transform a single receiver ID to a UserType.
    """
    if receiver == NAMESPACE_PUBLIC:
        # Ignore since we already store "public" as a boolean on the entity
        return []

    # Work in progress
    #obj = retrieve_and_parse_document(receiver)
    #if isinstance(obj, ActivitypubProfile):
    #    return [UserType(id=receiver, receiver_variant=ReceiverVariant.ACTOR)]
    #if isinstance(obj, Collection) and base_url:
    #    return process_followers(obj, base_url)

    actor = getattr(entity, 'actor_id', None) or ""
    # Check whether this is a reference to the followers collection of an actor.
    # TODO: terrible hack! The way some platforms deliver to sharedInbox using just
    # the followers collection as a target is annoying, since we would have to store
    # the followers collection references on the application side, which we don't
    # want to do because it would make application development another step more
    # complex. So for now we make the crude assumption that if
    # 1) "followers" appears in the ID, and
    # 2) the ID starts with the actor ID,
    # then this is the followers collection of said actor.
    # Once we have a caching system, just fetch each receiver and check what it is;
    # without caching this would be too expensive to do.
    if receiver.find("followers") > -1 and receiver.startswith(actor):
        return [UserType(id=actor, receiver_variant=ReceiverVariant.FOLLOWERS)]
    # Assume an actor ID
    return [UserType(id=receiver, receiver_variant=ReceiverVariant.ACTOR)]
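The followers heuristic described in the comment above can be isolated into a small predicate. This is an illustrative sketch only (the helper name is not part of the library's API):

```python
# Heuristic from extract_receiver(): a receiver is treated as the actor's
# followers collection when "followers" appears in the ID and the ID
# starts with the actor ID.
def is_followers_collection(receiver: str, actor: str) -> bool:
    return "followers" in receiver and receiver.startswith(actor)

assert is_followers_collection(
    "https://example.com/u/alice/followers/", "https://example.com/u/alice/")
assert not is_followers_collection(
    "https://example.com/u/bob/followers/", "https://example.com/u/alice/")
```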


def extract_receivers(entity):
    """
    Extract receivers from a payload.
    """
    receivers = []
    for attr in ("to", "cc"):
        receiver = getattr(entity, attr, None)
        if isinstance(receiver, list):
            for item in receiver:
                extracted = extract_receiver(entity, item)
                if extracted:
                    receivers += extracted
        elif isinstance(receiver, str):
            extracted = extract_receiver(entity, receiver)
            if extracted:
                receivers += extracted
    return receivers
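The to/cc normalization done above (a string or a list, flattened into one receiver list) can be sketched on plain dicts; the helper and its input shape are illustrative, not the library's API:

```python
# Flatten `to` and `cc`, which may each be a single string or a list,
# into one receiver list, as extract_receivers() does on entities.
def flatten_receivers(payload: dict) -> list:
    receivers = []
    for attr in ("to", "cc"):
        value = payload.get(attr)
        if isinstance(value, str):
            value = [value]
        receivers.extend(value or [])
    return receivers

assert flatten_receivers({"to": "a", "cc": ["b", "c"]}) == ["a", "b", "c"]
assert flatten_receivers({}) == []
```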


def extract_and_validate(entity):
    # Add the protocol name
    entity._source_protocol = "activitypub"
    # Extract receivers
    entity._receivers = extract_receivers(entity)
    if hasattr(entity, "post_receive"):
        entity.post_receive()

    if hasattr(entity, 'validate'): entity.validate()

    # Extract mentions
    if hasattr(entity, "extract_mentions"):
        entity.extract_mentions()

    # Extract reply ids
    if getattr(entity, 'replies', None):
        entity._replies = extract_reply_ids(getattr(entity.replies, 'first', []))


def extract_reply_ids(replies, visited=None):
    # Avoid a mutable default argument: a shared list would leak visited
    # state between calls.
    if visited is None:
        visited = []
    objs = []
    items = getattr(replies, 'items', [])
    if items and not isinstance(items, list): items = [items]
    for item in items:
        if isinstance(item, Object):
            objs.append(item.id)
        else:
            objs.append(item)
    if hasattr(replies, 'next_'):
        if replies.next_ and (replies.id != replies.next_) and (replies.next_ not in visited):
            resp = retrieve_and_parse_document(replies.next_)
            if resp:
                visited.append(replies.next_)
                objs += extract_reply_ids(resp, visited)
    return objs


def element_to_objects(element: Union[Dict, Object]) -> List:
    """
    Transform an element to a list of entities.
    """
    # json-ld handling with calamus.
    # Unimplemented payloads are skipped.
    # TODO: remove unused code
    entity = model_to_objects(element) if not isinstance(element, Object) else element
    if entity: entity = entity.to_base()
    if entity and isinstance(entity, BaseEntity):
        logger.info('Entity type "%s" was handled through the json-ld processor', entity.__class__.__name__)
        try:
            extract_and_validate(entity)
        except ValueError as ex:
            logger.error("Failed to validate entity %s: %s", entity, ex)
            return None
        return [entity]
    elif entity:
        logger.info('Entity type "%s" was handled through the json-ld processor but is not a base entity', entity.__class__.__name__)
        entity._receivers = extract_receivers(entity)
        return [entity]
    else:
        logger.warning("Payload not implemented by the json-ld processor, skipping")
        return []


def model_to_objects(payload):
    model = globals().get(payload.get('type'))
    if model and issubclass(model, Object):
        try:
            entity = model.schema().load(payload)
        except (KeyError, jsonld.JsonLdError, exceptions.ValidationError) as exc:
            # Just give up for now. This must be made more robust.
            logger.error("Error parsing jsonld payload (%s)", exc)
            return None

        if isinstance(getattr(entity, 'object_', None), Object):
            entity.object_.activity = entity
            entity = entity.object_

        return entity
    return None
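The type lookup in `model_to_objects` resolves the AS2 `type` string against module globals and guards with an `issubclass` check. A standalone sketch of that pattern (class names here are stand-ins, not the models above):

```python
# Resolve a type name to a model class via module globals, rejecting
# names that are missing or not subclasses of the expected base.
class Base: ...
class NoteModel(Base): ...

def lookup(type_name: str):
    model = globals().get(type_name)
    return model if model and issubclass(model, Base) else None

assert lookup("NoteModel") is NoteModel
assert lookup("Missing") is None
```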

@ -1,4 +1,5 @@
from typing import Dict, Tuple
from mimetypes import guess_type

from dirty_validators.basic import Email


@ -43,12 +44,18 @@ class Image(OptionalRawContentMixin, CreatedAtMixin, BaseEntity):
        self.media_type = self.get_media_type()

    def get_media_type(self) -> str:
        media_type = fetch_content_type(self.url)
        media_type = guess_type(self.url)[0] or fetch_content_type(self.url)
        if media_type in self._valid_media_types:
            return media_type
        return ""
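The change above consults the local `mimetypes` table before falling back to a network lookup. A self-contained sketch of that fallback chain, with the network helper stubbed out (the stub is illustrative; the real `fetch_content_type` does an HTTP request):

```python
from mimetypes import guess_type

def fetch_content_type(url):
    # Stand-in for the network helper; a real call would hit the URL.
    return "application/octet-stream"

def media_type(url: str) -> str:
    # Local extension lookup first, network fallback only when needed.
    return guess_type(url)[0] or fetch_content_type(url)

assert media_type("https://example.com/pic.png") == "image/png"
assert media_type("https://example.com/blob") == "application/octet-stream"
```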


class Audio(OptionalRawContentMixin, CreatedAtMixin, BaseEntity):
    pass


class Video(OptionalRawContentMixin, CreatedAtMixin, BaseEntity):
    pass


class Comment(RawContentMixin, ParticipationMixin, CreatedAtMixin, RootTargetIDMixin, BaseEntity):
    """Represents a comment, linked to another object."""
    participation = "comment"

@ -37,13 +37,20 @@ class BaseEntity:
        self._children = []
        self._mentions = set()
        self._receivers = []
        for key, value in kwargs.items():
            if hasattr(self, key):

        # make the assumption that if a schema is being used, the payload
        # is deserialized and validated properly
        if kwargs.get('has_schema'):
            for key, value in kwargs.items():
                setattr(self, key, value)
            else:
                warnings.warn("%s.__init__ got parameter %s which this class does not support - ignoring." % (
                    self.__class__.__name__, key
                ))
        else:
            for key, value in kwargs.items():
                if hasattr(self, key):
                    setattr(self, key, value)
                else:
                    warnings.warn("%s.__init__ got parameter %s which this class does not support - ignoring." % (
                        self.__class__.__name__, key
                    ))
        if not self.activity:
            # Fill a default activity if not given and the entity class has one
            self.activity = getattr(self, "_default_activity", None)
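The hunk above adds a `has_schema` fast path: when a payload has already been deserialized and validated by a schema, unknown keyword arguments are accepted instead of warned about. A simplified stand-in class (not the library's `BaseEntity`) showing the two branches:

```python
import warnings

class EntitySketch:
    """Illustrative stand-in for the has_schema handling in BaseEntity."""
    def __init__(self, **kwargs):
        if kwargs.pop("has_schema", False):
            # Schema-validated payload: accept every attribute as-is.
            for key, value in kwargs.items():
                setattr(self, key, value)
        else:
            # Legacy path: only attributes the class declares are accepted.
            for key, value in kwargs.items():
                if hasattr(self, key):
                    setattr(self, key, value)
                else:
                    warnings.warn(f"ignoring unsupported parameter {key}")

e = EntitySketch(has_schema=True, extra=1)
assert e.extra == 1
```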

@ -228,8 +235,9 @@ class RawContentMixin(BaseEntity):
        config = get_configuration()
        if config["tags_path"]:
            def linkifier(tag: str) -> str:
                return f'<a href="{config["base_url"]}{config["tags_path"].replace(":tag:", tag.lower())}" ' \
                       f'class="mention hashtag" rel="noopener noreferrer">' \
                return f'<a class="mention hashtag" ' \
                       f' href="{config["base_url"]}{config["tags_path"].replace(":tag:", tag.lower())}" ' \
                       f'rel="noopener noreferrer">' \
                       f'#<span>{tag}</span></a>'
        else:
            linkifier = None

@ -254,7 +262,7 @@ class RawContentMixin(BaseEntity):
                display_name = mention
            rendered = rendered.replace(
                "@{%s}" % mention,
                f'@<a href="{mention}" class="mention"><span>{display_name}</span></a>',
                f'@<a class="mention" href="{mention}"><span>{display_name}</span></a>',
            )
        # Finally, linkify remaining URLs that are not links
        rendered = process_text_links(rendered)

@ -13,7 +13,7 @@ logger = logging.getLogger("federation")

def retrieve_remote_content(
    id: str, guid: str = None, handle: str = None, entity_type: str = None,
    sender_key_fetcher: Callable[[str], str] = None,
    sender_key_fetcher: Callable[[str], str] = None, cache: bool = True,
):
    """Retrieve remote content and return an Entity object.

@ -3,7 +3,9 @@ import logging
import re
from typing import Callable, Tuple, Union, Dict

from cryptography.exceptions import InvalidSignature
from Crypto.PublicKey.RSA import RsaKey
from requests_http_signature import HTTPSignatureHeaderAuth

from federation.entities.activitypub.enums import ActorType
from federation.entities.mixins import BaseEntity

@ -84,9 +86,18 @@ class Protocol:
        self.extract_actor()
        # Verify the message is from who it claims to be
        if not skip_author_verification:
            self.verify_signature()
            try:
                self.verify_signature()
            except (KeyError, InvalidSignature) as exc:
                logger.warning(f'Signature verification failed: {exc}')
                return self.actor, {}
        return self.actor, self.payload

    def verify_signature(self):
        # Verify the HTTP signature
        verify_request_signature(self.request, self.get_contact_key(self.actor))
        sig = HTTPSignatureHeaderAuth.get_sig_struct(self.request)
        signer = sig.get('keyId', '').split('#')[0] if sig.get('keyId') else self.actor
        key = self.get_contact_key(signer)
        if self.request.headers.get('Signature') and not key:
            raise KeyError(f'No public key found for {signer}')
        verify_request_signature(self.request, key)
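The new `verify_signature` resolves the signer from the HTTP Signature `keyId`, dropping the key fragment to recover the actor ID. An isolated sketch of that resolution (the helper name is illustrative):

```python
# Resolve the signing actor from a keyId such as
# "https://example.com/u/alice#main-key", falling back to a known actor
# when no keyId is present.
def signer_from_key_id(key_id: str, fallback: str) -> str:
    return key_id.split("#")[0] if key_id else fallback

assert signer_from_key_id(
    "https://example.com/u/alice#main-key", "x") == "https://example.com/u/alice"
assert signer_from_key_id("", "https://example.com/u/bob") == "https://example.com/u/bob"
```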

@ -38,7 +38,7 @@ def verify_request_signature(request: RequestType, public_key: Union[str, bytes]
    key = encode_if_text(public_key)
    date_header = request.headers.get("Date")
    if not date_header:
        raise ValueError("Rquest Date header is missing")
        raise ValueError("Request Date header is missing")

    ts = parse_http_date(date_header)
    dt = datetime.datetime.utcfromtimestamp(ts).replace(tzinfo=pytz.utc)

@ -1,6 +1,8 @@
from unittest.mock import Mock
from unittest.mock import Mock, DEFAULT

import pytest
import inspect
import requests

# noinspection PyUnresolvedReferences
from federation.tests.fixtures.entities import *

@ -21,7 +23,13 @@ def disable_network_calls(monkeypatch):
    def raise_for_status():
        pass

    monkeypatch.setattr("requests.get", Mock(return_value=MockResponse))
    saved_get = requests.get

    def side_effect(*args, **kwargs):
        if "pyld/documentloader" in inspect.stack()[4][1]:
            return saved_get(*args, **kwargs)
        return DEFAULT

    monkeypatch.setattr("requests.get", Mock(return_value=MockResponse, side_effect=side_effect))


@pytest.fixture

@ -4,6 +4,7 @@ INSTALLED_APPS = tuple()

FEDERATION = {
    "base_url": "https://example.com",
    "federation_id": "https://example.com/u/john/",
    "get_object_function": "federation.tests.django.utils.get_object_function",
    "get_private_key_function": "federation.tests.django.utils.get_private_key",
    "get_profile_function": "federation.tests.django.utils.get_profile",

@ -1,4 +1,5 @@
from unittest.mock import patch
from pprint import pprint

# noinspection PyPackageRequirements
from Crypto.PublicKey.RSA import RsaKey

@ -43,7 +44,7 @@ class TestEntitiesConvertToAS2:
        assert result == {
            '@context': [
                'https://www.w3.org/ns/activitystreams',
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation"},
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation#"},
                {'Hashtag': 'as:Hashtag'},
                'https://w3id.org/security/v1',
                {'sensitive': 'as:sensitive'},

@ -76,7 +77,7 @@ class TestEntitiesConvertToAS2:
        assert result == {
            '@context': [
                'https://www.w3.org/ns/activitystreams',
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation"},
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation#"},
                {'Hashtag': 'as:Hashtag'},
                'https://w3id.org/security/v1',
                {'sensitive': 'as:sensitive'},

@ -135,7 +136,7 @@ class TestEntitiesConvertToAS2:
        assert result == {
            '@context': [
                'https://www.w3.org/ns/activitystreams',
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation"},
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation#"},
                {'Hashtag': 'as:Hashtag'},
                'https://w3id.org/security/v1',
                {'sensitive': 'as:sensitive'},

@ -168,7 +169,7 @@ class TestEntitiesConvertToAS2:
        assert result == {
            '@context': [
                'https://www.w3.org/ns/activitystreams',
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation"},
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation#"},
                {'Hashtag': 'as:Hashtag'},
                'https://w3id.org/security/v1',
                {'sensitive': 'as:sensitive'},

@ -223,7 +224,7 @@ class TestEntitiesConvertToAS2:
        assert result == {
            '@context': [
                'https://www.w3.org/ns/activitystreams',
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation"},
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation#"},
                {'Hashtag': 'as:Hashtag'},
                'https://w3id.org/security/v1',
                {'sensitive': 'as:sensitive'},

@ -274,7 +275,7 @@ class TestEntitiesConvertToAS2:
        assert result == {
            '@context': [
                'https://www.w3.org/ns/activitystreams',
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation"},
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation#"},
                {'Hashtag': 'as:Hashtag'},
                'https://w3id.org/security/v1',
                {'sensitive': 'as:sensitive'},

@ -322,7 +323,7 @@ class TestEntitiesConvertToAS2:
        assert result == {
            '@context': [
                'https://www.w3.org/ns/activitystreams',
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation"},
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation#"},
                {'Hashtag': 'as:Hashtag'},
                'https://w3id.org/security/v1',
                {'sensitive': 'as:sensitive'},

@ -431,7 +432,7 @@ class TestEntitiesConvertToAS2:
        assert result == {
            '@context': [
                'https://www.w3.org/ns/activitystreams',
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation"},
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation#"},
            ],
            'type': 'Delete',
            'id': 'http://127.0.0.1:8000/post/123456/#delete',

@ -448,7 +449,7 @@ class TestEntitiesConvertToAS2:
        assert result == {
            '@context': [
                'https://www.w3.org/ns/activitystreams',
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation"},
                {"pyfed": "https://docs.jasonrobinson.me/ns/python-federation#"},
            ],
            'type': 'Undo',
            'id': 'http://127.0.0.1:8000/post/123456/#delete',

@ -70,8 +70,8 @@ class TestActivitypubEntityMappersReceive:
        assert post.raw_content == '<p><span class="h-card"><a class="u-url mention" ' \
                                   'href="https://dev.jasonrobinson.me/u/jaywink/">' \
                                   '@<span>jaywink</span></a></span> boom</p>'
        assert post.rendered_content == '<p><span class="h-card"><a href="https://dev.jasonrobinson.me/u/jaywink/" ' \
                                        'class="u-url mention">@<span>jaywink</span></a></span> boom</p>'
        assert post.rendered_content == '<p><span class="h-card"><a class="u-url mention" href="https://dev.jasonrobinson.me/u/jaywink/">' \
                                        '@<span>jaywink</span></a></span> boom</p>'
        assert post.id == "https://diaspodon.fr/users/jaywink/statuses/102356911717767237"
        assert post.actor_id == "https://diaspodon.fr/users/jaywink"
        assert post.public is True

@ -101,8 +101,8 @@ class TestActivitypubEntityMappersReceive:
        post = entities[0]
        assert isinstance(post, ActivitypubPost)
        assert isinstance(post, Post)
        assert post.rendered_content == '<p><span class="h-card"><a href="https://dev.jasonrobinson.me/u/jaywink/" ' \
                                        'class="u-url mention">@<span>jaywink</span></a></span> boom</p>'
        assert post.rendered_content == '<p><span class="h-card"><a class="u-url mention" href="https://dev.jasonrobinson.me/u/jaywink/">' \
                                        '@<span>jaywink</span></a></span> boom</p>'
        assert post.raw_content == '<p><span class="h-card"><a class="u-url mention" ' \
                                   'href="https://dev.jasonrobinson.me/u/jaywink/">' \
                                   '@<span>jaywink</span></a></span> boom</p>'
|
||||
|
@@ -127,7 +127,8 @@ class TestActivitypubEntityMappersReceive:
         assert len(entities) == 1
         post = entities[0]
         assert isinstance(post, ActivitypubPost)
-        assert len(post._children) == 1
+        # TODO: test video and audio attachment
+        assert len(post._children) == 2
         photo = post._children[0]
         assert isinstance(photo, Image)
         assert photo.url == "https://files.mastodon.social/media_attachments/files/017/642/079/original/" \
@@ -270,6 +271,8 @@ class TestActivitypubEntityMappersReceive:
         entities = message_to_objects(ACTIVITYPUB_PROFILE, "http://example.com/1234")
         assert entities[0]._source_protocol == "activitypub"

+    @pytest.mark.skip
+    # since calamus turns the whole payload into objects, the source payload is not kept
     def test_source_object(self):
         entities = message_to_objects(ACTIVITYPUB_PROFILE, "http://example.com/1234")
         entity = entities[0]

@@ -31,8 +31,8 @@ ACTIVITYPUB_COMMENT = {
     'atomUri': 'https://diaspodon.fr/users/jaywink/statuses/102356911717767237',
     'inReplyToAtomUri': 'https://dev.jasonrobinson.me/content/653bad70-41b3-42c9-89cb-c4ee587e68e4/',
     'conversation': 'tag:diaspodon.fr,2019-06-28:objectId=2347687:objectType=Conversation',
-    'content': '<p><span class="h-card"><a href="https://dev.jasonrobinson.me/u/jaywink/" class="u-url mention">@<span>jaywink</span></a></span> boom</p>',
-    'contentMap': {'en': '<p><span class="h-card"><a href="https://dev.jasonrobinson.me/u/jaywink/" class="u-url mention">@<span>jaywink</span></a></span> boom</p>'},
+    'content': '<p><span class="h-card"><a class="u-url mention" href="https://dev.jasonrobinson.me/u/jaywink/">@<span>jaywink</span></a></span> boom</p>',
+    'contentMap': {'en': '<p><span class="h-card"><a class="u-url mention" href="https://dev.jasonrobinson.me/u/jaywink/">@<span>jaywink</span></a></span> boom</p>'},
     'attachment': [],
     'tag': [{'type': 'Mention',
              'href': 'https://dev.jasonrobinson.me/p/d4574854-a5d7-42be-bfac-f70c16fcaa97/',
@@ -235,7 +235,8 @@ ACTIVITYPUB_RETRACTION = {
     },
 }

-ACTIVITYPUB_RETRACTION_SHARE = {'@context': 'https://www.w3.org/ns/activitystreams',
+ACTIVITYPUB_RETRACTION_SHARE = {
+    '@context': ['https://www.w3.org/ns/activitystreams',{"ostatus":"http://ostatus.org#","atomUri":"ostatus:atomUri"}],
     'id': 'https://mastodon.social/users/jaywink#announces/102571932479036987/undo',
     'type': 'Undo',
     'actor': 'https://mastodon.social/users/jaywink',
@@ -255,7 +256,7 @@ ACTIVITYPUB_RETRACTION_SHARE = {
     'signatureValue': 'erI90OrrLqK1DiTqb4OO72XLcE7m74Fs4cH6s0plKKELHa7BZFQmtQYXKEgA9LwIUdSRrIurAUiaDWAw2sQZDg7opYo9x3z+GJDMZ3KxhBND7iHO8ZeGhV1ZBBKUMuBb3BfhOkd3ADp+RQ/fHcw6kOcViV2VsQduinAgQRpiutmGCLd/7eshqSF/aL4tFoAOyCskkm/5JDMNp2nnHNoXXJ+SZf7a8C6YPNDxWd7GzyQNeWkTBBdCJBPvS4HI0wQrTWemBvy6uP8k5QQ7FnqrrRrk/7zrcibFSInuYxiRTRV++rQ3irIbXNtoLhWQd36Iu5U22BclmkS1AAVBDUIj8w=='}}

 ACTIVITYPUB_SHARE = {
-    '@context': 'https://www.w3.org/ns/activitystreams',
+    '@context': ['https://www.w3.org/ns/activitystreams',{"ostatus":"http://ostatus.org#","atomUri":"ostatus:atomUri"}],
     'id': 'https://mastodon.social/users/jaywink/statuses/102560701449465612/activity',
     'type': 'Announce',
     'actor': 'https://mastodon.social/users/jaywink',
@@ -327,8 +328,8 @@ ACTIVITYPUB_POST = {
     'atomUri': 'https://diaspodon.fr/users/jaywink/statuses/102356911717767237',
     'inReplyToAtomUri': None,
     'conversation': 'tag:diaspodon.fr,2019-06-28:objectId=2347687:objectType=Conversation',
-    'content': '<p><span class="h-card"><a href="https://dev.jasonrobinson.me/u/jaywink/" class="u-url mention">@<span>jaywink</span></a></span> boom</p>',
-    'contentMap': {'en': '<p><span class="h-card"><a href="https://dev.jasonrobinson.me/u/jaywink/" class="u-url mention">@<span>jaywink</span></a></span> boom</p>'},
+    'content': '<p><span class="h-card"><a class="u-url mention" href="https://dev.jasonrobinson.me/u/jaywink/">@<span>jaywink</span></a></span> boom</p>',
+    'contentMap': {'en': '<p><span class="h-card"><a class="u-url mention" href="https://dev.jasonrobinson.me/u/jaywink/">@<span>jaywink</span></a></span> boom</p>'},
     'attachment': [],
     'tag': [{'type': 'Mention',
             'href': 'https://dev.jasonrobinson.me/p/d4574854-a5d7-42be-bfac-f70c16fcaa97/',
@@ -524,12 +525,12 @@ ACTIVITYPUB_POST_WITH_SOURCE_BBCODE = {
     'atomUri': 'https://diaspodon.fr/users/jaywink/statuses/102356911717767237',
     'inReplyToAtomUri': None,
     'conversation': 'tag:diaspodon.fr,2019-06-28:objectId=2347687:objectType=Conversation',
-    'content': '<p><span class="h-card"><a href="https://dev.jasonrobinson.me/u/jaywink/" class="u-url mention">@<span>jaywink</span></a></span> boom</p>',
+    'content': '<p><span class="h-card"><a class="u-url mention" href="https://dev.jasonrobinson.me/u/jaywink/">@<span>jaywink</span></a></span> boom</p>',
     'source': {
         'content': "[url=https://example.com]jaywink[/url] boom",
         'mediaType': "text/bbcode",
     },
-    'contentMap': {'en': '<p><span class="h-card"><a href="https://dev.jasonrobinson.me/u/jaywink/" class="u-url mention">@<span>jaywink</span></a></span> boom</p>'},
+    'contentMap': {'en': '<p><span class="h-card"><a class="u-url mention" href="https://dev.jasonrobinson.me/u/jaywink/">@<span>jaywink</span></a></span> boom</p>'},
     'attachment': [],
     'tag': [{'type': 'Mention',
             'href': 'https://dev.jasonrobinson.me/p/d4574854-a5d7-42be-bfac-f70c16fcaa97/',
@@ -545,7 +546,17 @@ ACTIVITYPUB_POST_WITH_SOURCE_BBCODE = {
     'signatureValue': 'SjDACS7Z/Cb1SEC3AtxEokID5SHAYl7kpys/hhmaRbpXuFKCxfj2P9BmH8QhLnuam3sENZlrnBOcB5NlcBhIfwo/Xh242RZBmPQf+edTVYVCe1j19dihcftNCHtnqAcKwp/51dNM/OlKu2730FrwvOUXVIPtB7iVqkseO9TRzDYIDj+zBTksnR/NAYtq6SUpmefXfON0uW3N3Uq6PGfExJaS+aeqRf8cPGkZFSIUQZwOLXbIpb7BFjJ1+y1OMOAJueqvikUprAit3v6BiNWurAvSQpC7WWMFUKyA79/xtkO9kIPA/Q4C9ryqdzxZJ0jDhXiaIIQj2JZfIADdjLZHJA=='}
 }

-ACTIVITYPUB_POST_OBJECT = {
+ACTIVITYPUB_POST_OBJECT = {'@context': ['https://www.w3.org/ns/activitystreams',
+                                        {'ostatus': 'http://ostatus.org#',
+                                         'atomUri': 'ostatus:atomUri',
+                                         'inReplyToAtomUri': 'ostatus:inReplyToAtomUri',
+                                         'conversation': 'ostatus:conversation',
+                                         'sensitive': 'as:sensitive',
+                                         'Hashtag': 'as:Hashtag',
+                                         'toot': 'http://joinmastodon.org/ns#',
+                                         'Emoji': 'toot:Emoji',
+                                         'focalPoint': {'@container': '@list', '@id': 'toot:focalPoint'},
+                                         'blurhash': 'toot:blurhash'}],
     'id': 'https://diaspodon.fr/users/jaywink/statuses/102356911717767237',
     'type': 'Note',
     'summary': None,

@@ -1,6 +1,8 @@
 import json
 from unittest.mock import patch, Mock

+import pytest
+
 from federation.entities.activitypub.entities import ActivitypubFollow, ActivitypubPost
 from federation.tests.fixtures.payloads import (
     ACTIVITYPUB_FOLLOW, ACTIVITYPUB_POST, ACTIVITYPUB_POST_OBJECT, ACTIVITYPUB_POST_OBJECT_IMAGES)
@@ -42,8 +44,10 @@ class TestRetrieveAndParseDocument:
     @patch("federation.utils.activitypub.fetch_document", autospec=True, return_value=(None, None, None))
     def test_calls_fetch_document(self, mock_fetch):
         retrieve_and_parse_document("https://example.com/foobar")
+        # auth argument is passed with kwargs
+        auth = mock_fetch.call_args.kwargs.get('auth', None)
         mock_fetch.assert_called_once_with(
-            "https://example.com/foobar", extra_headers={'accept': 'application/activity+json'},
+            "https://example.com/foobar", extra_headers={'accept': 'application/activity+json'}, auth=auth,
         )

     @patch("federation.utils.activitypub.fetch_document", autospec=True, return_value=(

@@ -12,10 +12,10 @@ from federation.utils.network import (
 class TestFetchDocument:
     call_args = {"timeout": 10, "headers": {'user-agent': USER_AGENT}}

-    @patch("federation.utils.network.requests.get", autospec=True, return_value=Mock(status_code=200, text="foo"))
+    @patch("federation.utils.network.requests.get", return_value=Mock(status_code=200, text="foo"))
     def test_extra_headers(self, mock_get):
         fetch_document("https://example.com/foo", extra_headers={'accept': 'application/activity+json'})
-        mock_get.assert_called_once_with('https://example.com/foo', headers={
+        mock_get.assert_called_once_with('https://example.com/foo', timeout=10, headers={
             'user-agent': USER_AGENT, 'accept': 'application/activity+json',
         })

@@ -115,8 +115,8 @@ class TestProcessTextLinks:
             '<a href="/streams/tag/foobar">#foobar</a>'

     def test_does_not_remove_mention_classes(self):
-        assert process_text_links('<p><span class="h-card"><a href="https://dev.jasonrobinson.me/u/jaywink/" '
-                                  'class="u-url mention">@<span>jaywink</span></a></span> boom</p>') == \
+        assert process_text_links('<p><span class="h-card"><a class="u-url mention" href="https://dev.jasonrobinson.me/u/jaywink/">'
+                                  '@<span>jaywink</span></a></span> boom</p>') == \
             '<p><span class="h-card"><a class="u-url mention" href="https://dev.jasonrobinson.me/u/jaywink/" ' \
             'rel="nofollow" target="_blank">@<span>jaywink</span></a></span> boom</p>'

@@ -3,12 +3,18 @@ import logging
 from typing import Optional, Any

 from federation.entities.activitypub.entities import ActivitypubProfile
-from federation.entities.activitypub.mappers import message_to_objects
+from federation.protocols.activitypub.signing import get_http_authentication
 from federation.utils.network import fetch_document, try_retrieve_webfinger_document
 from federation.utils.text import decode_if_bytes, validate_handle

 logger = logging.getLogger('federation')

+try:
+    from federation.utils.django import get_federation_user
+    federation_user = get_federation_user()
+except (ImportError, AttributeError):
+    federation_user = None
+    logger.warning("django is required for get requests signing")

 def get_profile_id_from_webfinger(handle: str) -> Optional[str]:
     """
@@ -36,11 +42,12 @@ def retrieve_and_parse_document(fid: str) -> Optional[Any]:
     """
     Retrieve remote document by ID and return the entity.
     """
-    document, status_code, ex = fetch_document(fid, extra_headers={'accept': 'application/activity+json'})
+    from federation.entities.activitypub.models import element_to_objects  # Circulars
+    document, status_code, ex = fetch_document(fid, extra_headers={'accept': 'application/activity+json'},
+                                               auth=get_http_authentication(federation_user.rsa_private_key,f'{federation_user.id}#main-key') if federation_user else None)
     if document:
         document = json.loads(decode_if_bytes(document))
-        entities = message_to_objects(document, fid)
-        logger.info("retrieve_and_parse_document - found %s entities", len(entities))
+        entities = element_to_objects(document)
         if entities:
             logger.info("retrieve_and_parse_document - using first entity: %s", entities[0])
             return entities[0]
@@ -66,3 +73,4 @@ def retrieve_and_parse_profile(fid: str) -> Optional[ActivitypubProfile]:
                        profile, ex)
         return
     return profile
+

@@ -161,8 +161,7 @@ def parse_profile_from_hcard(hcard: str, handle: str):


 def retrieve_and_parse_content(
-        id: str, guid: str, handle: str, entity_type: str, sender_key_fetcher: Callable[[str], str]=None,
-):
+        id: str, guid: str, handle: str, entity_type: str, sender_key_fetcher: Callable[[str], str]=None):
     """Retrieve remote content and return an Entity class instance.

     This is basically the inverse of receiving an entity. Instead, we fetch it, then call "handle_receive".

@@ -2,6 +2,7 @@ import importlib

 from django.conf import settings
 from django.core.exceptions import ImproperlyConfigured
+from federation.types import UserType


 def get_configuration():
@@ -27,6 +28,7 @@ def get_configuration():
         "get_private_key_function" in configuration,
         "get_profile_function" in configuration,
         "base_url" in configuration,
+        "federation_id" in configuration,
     ]):
         raise ImproperlyConfigured("Missing required FEDERATION settings, please check documentation.")
     return configuration
@@ -42,3 +44,18 @@ def get_function_from_config(item):
     module = importlib.import_module(module_path)
     func = getattr(module, func_name)
     return func
+
+def get_federation_user():
+    config = get_configuration()
+    if not config.get('federation_id'): return None
+
+    try:
+        get_key = get_function_from_config("get_private_key_function")
+    except AttributeError:
+        return None
+
+    key = get_key(config['federation_id'])
+    if not key: return None
+
+    return UserType(id=config['federation_id'], private_key=key)
+

@@ -31,7 +31,7 @@ def fetch_content_type(url: str) -> Optional[str]:
     return response.headers.get('Content-Type')


-def fetch_document(url=None, host=None, path="/", timeout=10, raise_ssl_errors=True, extra_headers=None):
+def fetch_document(url=None, host=None, path="/", timeout=10, raise_ssl_errors=True, extra_headers=None, **kwargs):
     """Helper method to fetch remote document.

     Must be given either the ``url`` or ``host``.
@@ -44,6 +44,7 @@ def fetch_document(url=None, host=None, path="/", timeout=10, raise_ssl_errors=True, extra_headers=None):
     :arg timeout: Seconds to wait for response (defaults to 10)
     :arg raise_ssl_errors: Pass False if you want to try HTTP even for sites with SSL errors (default True)
     :arg extra_headers: Optional extra headers dictionary to add to requests
+    :arg kwargs holds extra args passed to requests.get
     :returns: Tuple of document (str or None), status code (int or None) and error (an exception class instance or None)
     :raises ValueError: If neither url nor host are given as parameters
     """
@@ -59,7 +60,7 @@ def fetch_document(url=None, host=None, path="/", timeout=10, raise_ssl_errors=True, extra_headers=None):
         # Use url since it was given
         logger.debug("fetch_document: trying %s", url)
         try:
-            response = requests.get(url, timeout=timeout, headers=headers)
+            response = requests.get(url, timeout=timeout, headers=headers, **kwargs)
             logger.debug("fetch_document: found document, code %s", response.status_code)
             response.raise_for_status()
             return response.text, response.status_code, None

setup.py
@@ -29,6 +29,7 @@ setup(
     install_requires=[
         "attrs",
         "bleach>3.0",
+        "calamus",
         "commonmark",
         "cryptography",
         "cssselect>=0.9.2",
@@ -43,6 +44,7 @@ setup(
         "pytz",
         "PyYAML",
         "requests>=2.8.0",
+        "requests-cache",
        "requests-http-signature-jaywink>=0.1.0.dev0",
     ],
     include_package_data=True,

tox.ini
@@ -4,7 +4,7 @@
 # and then run "tox" from this directory.

 [tox]
-envlist = py38
+envlist = py310

 [testenv]
 usedevelop = True