Mirror of https://github.com/markqvist/NomadNet.git, synced 2025-12-30 13:14:40 +01:00
Compare commits
37 Commits
| SHA1 |
|---|
| efd6285012 |
| 11ccc76d93 |
| f2d0ea9910 |
| b35cc04f57 |
| 156e4f379e |
| 004fa3690e |
| b4c656770f |
| d3ddfd6c9c |
| c4bf97dae2 |
| 35910343fe |
| 5f3b2eb020 |
| 93cc790ebd |
| 7ed85ad398 |
| 5ceb0c671e |
| 3e7e55a9ca |
| a7a88e6a3e |
| 5de1b93fd9 |
| c9ca0a2fd1 |
| cb87148ec3 |
| 9596361a6b |
| d4a3a91e04 |
| dbd1d87adb |
| db3642ee05 |
| 6b74e49b0f |
| 5253cccfa7 |
| a2e6a06a35 |
| 9c79496504 |
| eafe77718f |
| abde448e00 |
| 6d2bf21f0d |
| eac9021c75 |
| e6688b157e |
| ad2cefa329 |
| 373315423e |
| 025cae6ebf |
| 0baebe5a3c |
| 3fedd0af30 |
MIRROR.md (new file, 33 lines)
@@ -0,0 +1,33 @@
This repository is a public mirror. All potential future development is happening elsewhere.

I am stepping back from all public-facing interaction with this project. Reticulum has always been primarily my work, and continuing in the current public, internet-facing model is no longer sustainable.

The software remains available for use as-is. Occasional updates may appear at unpredictable intervals, but there will be no support, no responses to issues, no discussions, and no community management in this or any other public venue. If it doesn't work for you, it doesn't work. That is the entire extent of the troubleshooting assistance I can offer.

If you've followed this project for a while, you already know what this means. You know who designed, wrote and tested this, and you know how many years of my life it took. You'll also know about both my particular challenges and strengths, and how I believe anything worth building needs to be built and maintained with our own hands.

Seven months ago, I said I needed to step back, that I was exhausted, and that I needed to recover. I believed that saying so publicly would be enough to make it happen, but while striving to get just a few more useful features and protocols out, the unproductive requests and demands also ramped up, and I got pulled back into the same patterns and draining interactions that I'd explicitly said I couldn't sustain anymore.

So here's what you might have already guessed: I'm done playing the game by rules I can't win at.

Everything you need is right here, and by any sensible measure, it's done. Anyone who wants to invest the time, skill and persistence can build on it, or completely re-imagine it with different priorities. That was always the point.

The people who actually contributed - you know who you are, and you know I mean it when I say: Thank you. All of you who've used this to build something real - that was the goal, and you did it without needing me to hold your hand.

The rest of you: You have what you need. Use it or don't. I am not going to be the person who explains it to you anymore.

This is not a temporary break. It's not "see you after some rest", but a recognition that the current model is fundamentally incompatible with my life, my health, and my reality.

If you want to support continued work, you can do so at the donation links listed in this repository. But please understand that this is not purchasing support or guaranteeing updates. It is support for work that happens on my timeline, according to my capacity, which at the moment is not what it was.

If you want Reticulum to continue evolving, you have the power to make that happen. The protocol is public domain. The code is open source. Everything you need is right here. I've provided the tools, but building what comes next is not my responsibility anymore. It's yours.

To the small group of people who have actually been here and understood what this work was and what it cost - you already know where to find me if it actually matters.

To everyone else: This is where we part ways. No hard feelings. It's just time.

---

असतो मा सद्गमय
तमसो मा ज्योतिर्गमय
मृत्योर्मा अमृतं गमय
README.md (11 lines changed)
@@ -1,5 +1,7 @@
# Nomad Network - Communicate Freely

*This repository is [a public mirror](./MIRROR.md). All development is happening elsewhere.*

Off-grid, resilient mesh communication with strong encryption, forward secrecy and extreme privacy.



@@ -148,14 +150,17 @@ You can help support the continued development of open, free and private communi

```
- Bitcoin
```
bc1p4a6axuvl7n9hpapfj8sv5reqj8kz6uxa67d5en70vzrttj0fmcusgxsfk5
bc1pgqgu8h8xvj4jtafslq396v7ju7hkgymyrzyqft4llfslz5vp99psqfk3a6
```
- Ethereum
```
0xae89F3B94fC4AD6563F0864a55F9a697a90261ff
0x91C421DdfB8a30a49A71d63447ddb54cEBe3465E
```
- Liberapay: https://liberapay.com/Reticulum/

- Ko-Fi: https://ko-fi.com/markqvist


## Development Roadmap

- New major features
@@ -171,8 +176,6 @@ You can help support the continued development of open, free and private communi
- Better navigation handling when requests fail (also because of closed links)
- Retry failed messages mechanism
- Re-arrange buttons to be more consistent
- Input field for pages
- Post mechanism
- Term compatibility notice in readme
- Selected icon in conversation list
- Possibly a Search Local Nodes function

@@ -6,6 +6,9 @@ import nomadnet
|
||||
import threading
|
||||
import RNS.vendor.umsgpack as msgpack
|
||||
|
||||
from LXMF import pn_announce_data_is_valid
|
||||
from nomadnet.util import strip_modifiers
|
||||
|
||||
class PNAnnounceHandler:
|
||||
def __init__(self, owner):
|
||||
self.aspect_filter = "lxmf.propagation"
|
||||
@@ -13,10 +16,10 @@ class PNAnnounceHandler:
|
||||
|
||||
def received_announce(self, destination_hash, announced_identity, app_data):
|
||||
try:
|
||||
if type(app_data) == bytes:
|
||||
if pn_announce_data_is_valid(app_data):
|
||||
data = msgpack.unpackb(app_data)
|
||||
|
||||
if data[0] == True:
|
||||
if data[2] == True:
|
||||
RNS.log("Received active propagation node announce from "+RNS.prettyhexrep(destination_hash))
|
||||
|
||||
associated_peer = RNS.Destination.hash_from_name_and_identity("lxmf.delivery", announced_identity)
|
||||
@@ -193,7 +196,8 @@ class Directory:
|
||||
found_node = True
|
||||
break
|
||||
|
||||
if not found_node:
|
||||
# TODO: Remove debug and rethink this (needs way to set PN when node is saved)
|
||||
if True or not found_node:
|
||||
if self.app.compact_stream:
|
||||
try:
|
||||
remove_announces = []
|
||||
@@ -226,7 +230,7 @@ class Directory:
|
||||
|
||||
def display_name(self, source_hash):
|
||||
if source_hash in self.directory_entries:
|
||||
return self.directory_entries[source_hash].display_name
|
||||
return strip_modifiers(self.directory_entries[source_hash].display_name)
|
||||
else:
|
||||
return None
|
||||
|
||||
@@ -238,7 +242,7 @@ class Directory:
|
||||
if dn == None:
|
||||
return RNS.prettyhexrep(source_hash)
|
||||
else:
|
||||
return dn+" <"+RNS.hexrep(source_hash, delimit=False)+">"
|
||||
return strip_modifiers(dn)+" <"+RNS.hexrep(source_hash, delimit=False)+">"
|
||||
else:
|
||||
return "<"+RNS.hexrep(source_hash, delimit=False)+">"
|
||||
else:
|
||||
@@ -247,13 +251,13 @@ class Directory:
|
||||
if dn == None:
|
||||
return RNS.prettyhexrep(source_hash)
|
||||
else:
|
||||
return dn
|
||||
return strip_modifiers(dn)
|
||||
else:
|
||||
return "<"+RNS.hexrep(source_hash, delimit=False)+">"
|
||||
|
||||
def alleged_display_str(self, source_hash):
|
||||
if source_hash in self.directory_entries:
|
||||
return self.directory_entries[source_hash].display_name
|
||||
return strip_modifiers(self.directory_entries[source_hash].display_name)
|
||||
else:
|
||||
return None
|
||||
|
||||
|
||||
@@ -122,13 +122,16 @@ class NomadNetworkApp:
|
||||
self.page_refresh_interval = 0
|
||||
self.file_refresh_interval = 0
|
||||
|
||||
self.static_peers = []
|
||||
self.peer_announce_at_start = True
|
||||
self.try_propagation_on_fail = True
|
||||
self.disable_propagation = False
|
||||
self.disable_propagation = True
|
||||
self.notify_on_new_message = True
|
||||
|
||||
self.lxmf_max_propagation_size = None
|
||||
self.lxmf_max_sync_size = None
|
||||
self.lxmf_max_incoming_size = None
|
||||
self.node_propagation_cost = LXMF.LXMRouter.PROPAGATION_COST
|
||||
|
||||
self.periodic_lxmf_sync = True
|
||||
self.lxmf_sync_interval = 360*60
|
||||
@@ -242,8 +245,11 @@ class NomadNetworkApp:
|
||||
self.peer_settings["served_file_requests"] = 0
|
||||
|
||||
except Exception as e:
|
||||
RNS.log("Could not load local peer settings from "+self.peersettingspath, RNS.LOG_ERROR)
|
||||
RNS.log("The contained exception was: %s" % (str(e)), RNS.LOG_ERROR)
|
||||
RNS.logdest = RNS.LOG_STDOUT
|
||||
RNS.log(f"Could not load local peer settings from {self.peersettingspath}", RNS.LOG_ERROR)
|
||||
RNS.log(f"The contained exception was: {e}", RNS.LOG_ERROR)
|
||||
RNS.log(f"This likely means that the peer settings file has become corrupt.", RNS.LOG_ERROR)
|
||||
RNS.log(f"You can try deleting the file at {self.peersettingspath} and restarting nomadnet.", RNS.LOG_ERROR)
|
||||
nomadnet.panic()
|
||||
else:
|
||||
try:
|
||||
@@ -302,8 +308,8 @@ class NomadNetworkApp:
|
||||
|
||||
self.message_router = LXMF.LXMRouter(
|
||||
identity = self.identity, storagepath = self.storagepath, autopeer = True,
|
||||
propagation_limit = self.lxmf_max_propagation_size, delivery_limit = self.lxmf_max_incoming_size,
|
||||
max_peers = self.max_peers, static_peers = static_peers,
|
||||
propagation_limit = self.lxmf_max_propagation_size, sync_limit = self.lxmf_max_sync_size, delivery_limit = self.lxmf_max_incoming_size,
|
||||
max_peers = self.max_peers, static_peers = static_peers, propagation_cost=self.node_propagation_cost
|
||||
)
|
||||
|
||||
self.message_router.register_delivery_callback(self.lxmf_delivery)
|
||||
@@ -555,9 +561,9 @@ class NomadNetworkApp:
|
||||
return self.message_router.get_outbound_propagation_node()
|
||||
|
||||
def save_peer_settings(self):
|
||||
file = open(self.peersettingspath, "wb")
|
||||
file.write(msgpack.packb(self.peer_settings))
|
||||
file.close()
|
||||
tmp_path = f"{self.peersettingspath}.tmp"
|
||||
with open(tmp_path, "wb") as file: file.write(msgpack.packb(self.peer_settings))
|
||||
os.replace(tmp_path, self.peersettingspath)
|
||||
|
||||
def lxmf_delivery(self, message):
|
||||
time_string = time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(message.timestamp))
|
||||
@@ -802,7 +808,7 @@ class NomadNetworkApp:
|
||||
if not "intro_time" in self.config["textui"]:
|
||||
self.config["textui"]["intro_time"] = 1
|
||||
else:
|
||||
self.config["textui"]["intro_time"] = self.config["textui"].as_int("intro_time")
|
||||
self.config["textui"]["intro_time"] = self.config["textui"].as_float("intro_time")
|
||||
|
||||
if not "intro_text" in self.config["textui"]:
|
||||
self.config["textui"]["intro_text"] = "Nomad Network"
|
||||
@@ -876,7 +882,7 @@ class NomadNetworkApp:
|
||||
self.node_name = self.config["node"]["node_name"]
|
||||
|
||||
if not "disable_propagation" in self.config["node"]:
|
||||
self.disable_propagation = False
|
||||
self.disable_propagation = True
|
||||
else:
|
||||
self.disable_propagation = self.config["node"].as_bool("disable_propagation")
|
||||
|
||||
@@ -888,6 +894,14 @@ class NomadNetworkApp:
|
||||
value = 1
|
||||
self.lxmf_max_propagation_size = value
|
||||
|
||||
if not "max_sync_size" in self.config["node"]:
|
||||
self.lxmf_max_sync_size = 256*40
|
||||
else:
|
||||
value = self.config["node"].as_float("max_sync_size")
|
||||
if value < self.lxmf_max_propagation_size:
|
||||
value = self.lxmf_max_propagation_size
|
||||
self.lxmf_max_sync_size = value
|
||||
|
||||
if not "announce_at_start" in self.config["node"]:
|
||||
self.node_announce_at_start = False
|
||||
else:
|
||||
@@ -901,6 +915,13 @@ class NomadNetworkApp:
|
||||
if value < 1:
|
||||
value = 1
|
||||
self.node_announce_interval = value
|
||||
|
||||
if not "propagation_cost" in self.config["node"]:
|
||||
self.node_propagation_cost = 16
|
||||
else:
|
||||
value = self.config["node"].as_int("propagation_cost")
|
||||
if value < 13: value = 13
|
||||
self.node_propagation_cost = value
|
||||
|
||||
if "pages_path" in self.config["node"]:
|
||||
self.pagespath = self.config["node"]["pages_path"]
|
||||
@@ -1162,14 +1183,55 @@ announce_at_start = Yes
|
||||
|
||||
# When Nomad Network is hosting a page-serving
|
||||
# node, it can also act as an LXMF propagation
|
||||
# node. If there is already a large amount of
|
||||
# node. This is a convenient feature that lets
|
||||
# you easily set up and run a propagation node
|
||||
# on the network, but it is not as fully
|
||||
# featured as using the lxmd program to host a
|
||||
# propagation node. For complete control and
|
||||
# flexibility, use lxmd to run a PN. For a
|
||||
# small local system or network, the built-in
|
||||
# PN functionality will suffice for most cases.
|
||||
#
|
||||
# If there is already a large amount of
|
||||
# propagation nodes on the network, or you
|
||||
# simply want to run a pageserving-only node,
|
||||
# you can disable running a propagation node.
|
||||
# you should disable running a propagation node.
|
||||
# Because many propagation nodes are already
# available, this is currently the default.
|
||||
|
||||
disable_propagation = Yes
|
||||
|
||||
# For clients and other propagation nodes
|
||||
# delivering messages via this node, you can
|
||||
# configure the minimum required propagation
|
||||
# stamp costs. All messages delivered to the
|
||||
# propagation node network must have a valid
|
||||
# propagation stamp, or they will be rejected.
|
||||
# Clients automatically detect the stamp cost
|
||||
# for the node they are delivering to, and
|
||||
# compute a corresponding stamp before trying
|
||||
# to deliver the message to the propagation
|
||||
# node.
|
||||
#
|
||||
# Propagation stamps are easier to verify in
|
||||
# large batches, and therefore also somewhat
|
||||
# easier to compute for the senders. As such,
|
||||
# a reasonable propagation stamp cost should
|
||||
# be a bit higher than the normal peer-to-peer
|
||||
# stamp costs.
|
||||
#
|
||||
# Propagation stamps do not incur any extra
|
||||
# load for propagation nodes processing them,
|
||||
# since they are only required to verify that
|
||||
# they are correct, and only the generation
|
||||
# is computationally costly. Setting a sensible
|
||||
# propagation stamp cost (and periodically
|
||||
# checking the average network consensus) helps
|
||||
# keep spam and misuse out of the propagation
|
||||
# node network.
|
||||
|
||||
propagation_cost = 16
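The comments above lean on an asymmetry between generating and verifying stamps. The sketch below illustrates that property with a generic hashcash-style proof-of-work; it is not the actual LXMF stamp format or API, just a minimal model of why senders pay the computational cost while propagation nodes only verify:

```python
# Illustrative only: a generic hashcash-style proof-of-work "stamp", not the
# actual LXMF propagation stamp format or API. Generating a stamp takes a
# brute-force search; verifying one takes a single hash.
import os
import hashlib

def leading_zero_bits(digest: bytes) -> int:
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
        else:
            bits += 8 - byte.bit_length()
            break
    return bits

def generate_stamp(message_id: bytes, cost: int) -> bytes:
    # Costly: on average about 2**cost attempts before a valid nonce is found.
    while True:
        nonce = os.urandom(16)
        if leading_zero_bits(hashlib.sha256(message_id + nonce).digest()) >= cost:
            return nonce

def verify_stamp(message_id: bytes, nonce: bytes, cost: int) -> bool:
    # Cheap: a single hash, so a node can check incoming stamps in bulk.
    return leading_zero_bits(hashlib.sha256(message_id + nonce).digest()) >= cost

if __name__ == "__main__":
    msg_id = hashlib.sha256(b"example message").digest()
    nonce = generate_stamp(msg_id, cost=12)      # small cost so the demo runs quickly
    print(verify_stamp(msg_id, nonce, cost=12))  # True
```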
|
||||
|
||||
# The maximum amount of storage to use for
|
||||
# the LXMF Propagation Node message store,
|
||||
# specified in megabytes. When this limit
|
||||
@@ -1179,19 +1241,26 @@ disable_propagation = Yes
|
||||
# new and small. Large and old messages will
|
||||
# be removed first. This setting is optional
|
||||
# and defaults to 2 gigabytes.
|
||||
|
||||
# message_storage_limit = 2000
|
||||
|
||||
# The maximum accepted transfer size per in-
|
||||
# coming propagation transfer, in kilobytes.
|
||||
# This also sets the upper limit for the size
|
||||
# of single messages accepted onto this node.
|
||||
# coming propagation message, in kilobytes.
|
||||
# This sets the upper limit for the size of
|
||||
# single messages accepted onto this node.
|
||||
|
||||
max_transfer_size = 256
|
||||
|
||||
# The maximum accepted transfer size per in-
|
||||
# coming propagation node sync.
|
||||
#
|
||||
# If a node wants to propagate a larger number
|
||||
# of messages to this node, than what can fit
|
||||
# within this limit, it will prioritise sending
|
||||
# the smallest, newest messages first, and try
|
||||
# the smallest messages first, and try again
|
||||
# with any remaining messages at a later point.
|
||||
max_transfer_size = 256
|
||||
|
||||
max_sync_size = 10240
|
||||
|
||||
# You can tell the LXMF message router to
|
||||
# prioritise storage for one or more
|
||||
@@ -1200,29 +1269,34 @@ max_transfer_size = 256
|
||||
# keeping messages for destinations specified
|
||||
# with this option. This setting is optional,
|
||||
# and generally you do not need to use it.
|
||||
|
||||
# prioritise_destinations = 41d20c727598a3fbbdf9106133a3a0ed, d924b81822ca24e68e2effea99bcb8cf
|
||||
|
||||
# You can configure the maximum number of other
|
||||
# propagation nodes that this node will peer
|
||||
# with automatically. The default is 50.
|
||||
# max_peers = 25
|
||||
# with automatically. The default is 20.
|
||||
|
||||
# max_peers = 20
|
||||
|
||||
# You can configure a list of static propagation
|
||||
# node peers, that this node will always be
|
||||
# peered with, by specifying a list of
|
||||
# destination hashes.
|
||||
|
||||
# static_peers = e17f833c4ddf8890dd3a79a6fea8161d, 5a2d0029b6e5ec87020abaea0d746da4
|
||||
|
||||
# You can specify the interval in minutes for
|
||||
# rescanning the hosted pages path. By default,
|
||||
# this option is disabled, and the pages path
|
||||
# will only be scanned on startup.
|
||||
|
||||
# page_refresh_interval = 0
|
||||
|
||||
# You can specify the interval in minutes for
|
||||
# rescanning the hosted files path. By default,
|
||||
# this option is disabled, and the files path
|
||||
# will only be scanned on startup.
|
||||
|
||||
# file_refresh_interval = 0
|
||||
|
||||
[printing]
|
||||
@@ -1231,6 +1305,7 @@ max_transfer_size = 256
|
||||
# various kinds of information and messages.
|
||||
|
||||
# Printing messages is disabled by default
|
||||
|
||||
print_messages = No
|
||||
|
||||
# You can configure a custom template for
|
||||
@@ -1238,24 +1313,29 @@ print_messages = No
|
||||
# option, set a path to the template and
|
||||
# restart Nomad Network, a default template
|
||||
# will be created that you can edit.
|
||||
|
||||
# message_template = ~/.nomadnetwork/print_template_msg.txt
|
||||
|
||||
# You can configure Nomad Network to only
|
||||
# print messages from trusted destinations.
|
||||
|
||||
# print_from = trusted
|
||||
|
||||
# Or specify the source LXMF addresses that
|
||||
# will automatically have messages printed
|
||||
# on arrival.
|
||||
|
||||
# print_from = 76fe5751a56067d1e84eef3e88eab85b, 0e70b5848eb57c13154154feaeeb89b7
|
||||
|
||||
# Or allow printing from anywhere, if you
|
||||
# are feeling brave and adventurous.
|
||||
|
||||
# print_from = everywhere
|
||||
|
||||
# You can configure the printing command.
|
||||
# This will use the default CUPS printer on
|
||||
# your system.
|
||||
|
||||
print_command = lp
|
||||
|
||||
# You can specify what printer to use
|
||||
|
||||
@@ -8,8 +8,10 @@ from .Node import Node
|
||||
from .ui import *
|
||||
|
||||
|
||||
modules = glob.glob(os.path.dirname(__file__)+"/*.py")
|
||||
__all__ = [ os.path.basename(f)[:-3] for f in modules if not f.endswith('__init__.py')]
|
||||
py_modules = glob.glob(os.path.dirname(__file__)+"/*.py")
|
||||
pyc_modules = glob.glob(os.path.dirname(__file__)+"/*.pyc")
|
||||
modules = py_modules+pyc_modules
|
||||
__all__ = list(set([os.path.basename(f).replace(".pyc", "").replace(".py", "") for f in modules if not (f.endswith("__init__.py") or f.endswith("__init__.pyc"))]))
|
||||
|
||||
def panic():
|
||||
os._exit(255)
|
||||
@@ -1 +1 @@
|
||||
__version__ = "0.8.0"
|
||||
__version__ = "0.9.5"
|
||||
|
||||
@@ -3,8 +3,10 @@ import glob
|
||||
import RNS
|
||||
import nomadnet
|
||||
|
||||
modules = glob.glob(os.path.dirname(__file__)+"/*.py")
|
||||
__all__ = [ os.path.basename(f)[:-3] for f in modules if not f.endswith('__init__.py')]
|
||||
py_modules = glob.glob(os.path.dirname(__file__)+"/*.py")
|
||||
pyc_modules = glob.glob(os.path.dirname(__file__)+"/*.pyc")
|
||||
modules = py_modules+pyc_modules
|
||||
__all__ = list(set([os.path.basename(f).replace(".pyc", "").replace(".py", "") for f in modules if not (f.endswith("__init__.py") or f.endswith("__init__.pyc"))]))
|
||||
|
||||
|
||||
UI_NONE = 0x00
|
||||
|
||||
@@ -8,9 +8,11 @@ import shutil
|
||||
import nomadnet
|
||||
import subprocess
|
||||
import threading
|
||||
from threading import Lock
|
||||
from .MicronParser import markup_to_attrmaps, make_style, default_state
|
||||
from nomadnet.Directory import DirectoryEntry
|
||||
from nomadnet.vendor.Scrollable import *
|
||||
from nomadnet.util import strip_modifiers
|
||||
|
||||
class BrowserFrame(urwid.Frame):
|
||||
def keypress(self, size, key):
|
||||
@@ -36,19 +38,20 @@ class BrowserFrame(urwid.Frame):
|
||||
if hasattr(self.delegate, "page_pile") and self.delegate.page_pile:
|
||||
def df(loop, user_data):
|
||||
st = None
|
||||
nf = self.delegate.page_pile.focus
|
||||
if hasattr(nf, "key_timeout"):
|
||||
st = nf
|
||||
elif hasattr(nf, "original_widget"):
|
||||
no = nf.original_widget
|
||||
if hasattr(no, "original_widget"):
|
||||
st = no.original_widget
|
||||
else:
|
||||
if hasattr(no, "key_timeout"):
|
||||
st = no
|
||||
|
||||
if st and hasattr(st, "key_timeout") and hasattr(st, "keypress") and callable(st.keypress):
|
||||
st.keypress(None, None)
|
||||
if self.delegate.page_pile:
|
||||
nf = self.delegate.page_pile.focus
|
||||
if hasattr(nf, "key_timeout"):
|
||||
st = nf
|
||||
elif hasattr(nf, "original_widget"):
|
||||
no = nf.original_widget
|
||||
if hasattr(no, "original_widget"):
|
||||
st = no.original_widget
|
||||
else:
|
||||
if hasattr(no, "key_timeout"):
|
||||
st = no
|
||||
|
||||
if st and hasattr(st, "key_timeout") and hasattr(st, "keypress") and callable(st.keypress):
|
||||
st.keypress(None, None)
|
||||
|
||||
nomadnet.NomadNetworkApp.get_shared_instance().ui.loop.set_alarm_in(0.25, df)
|
||||
|
||||
@@ -111,6 +114,9 @@ class Browser:
|
||||
self.frame = None
|
||||
self.attr_maps = []
|
||||
self.page_pile = None
|
||||
self.page_partials = {}
|
||||
self.updater_running = False
|
||||
self.partial_updater_lock = Lock()
|
||||
self.build_display()
|
||||
|
||||
self.history = []
|
||||
@@ -179,6 +185,7 @@ class Browser:
|
||||
return destination_type
|
||||
|
||||
def handle_link(self, link_target, link_data = None):
|
||||
partial_ids = None
|
||||
request_data = None
|
||||
if link_data != None:
|
||||
link_fields = []
|
||||
@@ -229,12 +236,6 @@ class Browser:
|
||||
else:
|
||||
pass # do nothing if the checkbox is not checked
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
|
||||
recurse_down(self.attr_maps)
|
||||
RNS.log("Including request data: "+str(request_data), RNS.LOG_DEBUG)
|
||||
|
||||
@@ -244,6 +245,10 @@ class Browser:
|
||||
if len(components) == 2:
|
||||
destination_type = self.expand_shorthands(components[0])
|
||||
link_target = components[1]
|
||||
elif link_target.startswith("p:"):
|
||||
comps = link_target.split(":")
|
||||
if len(comps) > 1: partial_ids = comps[1:]
|
||||
destination_type = "partial"
|
||||
else:
|
||||
destination_type = "nomadnetwork.node"
|
||||
link_target = components[0]
|
||||
@@ -264,6 +269,9 @@ class Browser:
|
||||
RNS.log("Passing LXMF link to handler", RNS.LOG_DEBUG)
|
||||
self.handle_lxmf_link(link_target)
|
||||
|
||||
elif destination_type == "partial":
|
||||
if partial_ids != None and len(partial_ids) > 0: self.handle_partial_updates(partial_ids)
|
||||
|
||||
else:
|
||||
RNS.log("No known handler for destination type "+str(destination_type), RNS.LOG_DEBUG)
|
||||
self.browser_footer = urwid.Text("Could not open link: "+"No known handler for destination type "+str(destination_type))
|
||||
@@ -317,6 +325,7 @@ class Browser:
|
||||
self.browser_footer = urwid.Text("")
|
||||
|
||||
self.page_pile = None
|
||||
self.page_partials = {}
|
||||
self.browser_body = urwid.Filler(
|
||||
urwid.Text("Disconnected\n"+self.g["arrow_l"]+" "+self.g["arrow_r"], align=urwid.CENTER),
|
||||
urwid.MIDDLE,
|
||||
@@ -373,6 +382,7 @@ class Browser:
|
||||
if self.status == Browser.DISCONECTED:
|
||||
self.display_widget.set_attr_map({None: "inactive_text"})
|
||||
self.page_pile = None
|
||||
self.page_partials = {}
|
||||
self.browser_body = urwid.Filler(
|
||||
urwid.Text("Disconnected\n"+self.g["arrow_l"]+" "+self.g["arrow_r"], align=urwid.CENTER),
|
||||
urwid.MIDDLE,
|
||||
@@ -446,7 +456,234 @@ class Browser:
|
||||
pile = urwid.Pile(self.attr_maps)
|
||||
pile.automove_cursor_on_scroll = True
|
||||
self.page_pile = pile
|
||||
self.page_partials = {}
|
||||
self.browser_body = urwid.AttrMap(ScrollBar(Scrollable(pile, force_forward_keypress=True), thumb_char="\u2503", trough_char=" "), "scrollbar")
|
||||
self.detect_partials()
|
||||
|
||||
def parse_url(self, url):
|
||||
path = None
|
||||
destination_hash = None
|
||||
components = url.split(":")
|
||||
if len(components) == 1:
|
||||
if len(components[0]) == (RNS.Reticulum.TRUNCATED_HASHLENGTH//8)*2:
|
||||
try: destination_hash = bytes.fromhex(components[0])
|
||||
except Exception as e: raise ValueError("Malformed URL")
|
||||
path = Browser.DEFAULT_PATH
|
||||
else: raise ValueError("Malformed URL")
|
||||
elif len(components) == 2:
|
||||
if len(components[0]) == (RNS.Reticulum.TRUNCATED_HASHLENGTH//8)*2:
|
||||
try: destination_hash = bytes.fromhex(components[0])
|
||||
except Exception as e: raise ValueError("Malformed URL")
|
||||
path = components[1]
|
||||
if len(path) == 0: path = Browser.DEFAULT_PATH
|
||||
else:
|
||||
if len(components[0]) == 0:
|
||||
if self.destination_hash != None:
|
||||
destination_hash = self.destination_hash
|
||||
path = components[1]
|
||||
if len(path) == 0: path = Browser.DEFAULT_PATH
|
||||
else: raise ValueError("Malformed URL")
|
||||
else: raise ValueError("Malformed URL")
|
||||
else: raise ValueError("Malformed URL")
|
||||
|
||||
return destination_hash, path
|
||||
|
||||
def detect_partials(self):
|
||||
for w in self.attr_maps:
|
||||
o = w._original_widget
|
||||
if hasattr(o, "partial_hash"):
|
||||
RNS.log(f"Found partial: {o.partial_hash} / {o.partial_url} / {o.partial_refresh}")
|
||||
partial = {"hash": o.partial_hash, "id": o.partial_id, "url": o.partial_url, "fields": o.partial_fields,
|
||||
"refresh": o.partial_refresh, "content": None, "updated": None, "update_requested": None, "request_id": None,
|
||||
"destination": None, "link": None, "pile": o, "attr_maps": None, "failed": False, "pr_throttle": 0}
|
||||
|
||||
self.page_partials[o.partial_hash] = partial
|
||||
|
||||
if len(self.page_partials) > 0: self.start_partial_updater()
|
||||
|
||||
def partial_failed(self, request_receipt):
|
||||
RNS.log("Loading page partial failed", RNS.LOG_ERROR)
|
||||
for pid in self.page_partials:
|
||||
partial = self.page_partials[pid]
|
||||
if partial["request_id"] == request_receipt.request_id:
|
||||
try:
|
||||
partial["updated"] = time.time()
|
||||
partial["request_id"] = None
|
||||
partial["content"] = None
|
||||
partial["attr_maps"] = None
|
||||
url = partial["url"]
|
||||
pile = partial["pile"]
|
||||
pile.contents = [(urwid.Text(f"Could not load partial {url}: The resource transfer failed"), pile.options())]
|
||||
except Exception as e:
|
||||
RNS.log(f"Error in partial failed callback: {e}", RNS.LOG_ERROR)
|
||||
RNS.trace_exception(e)
|
||||
|
||||
def partial_progressed(self, request_receipt):
|
||||
pass
|
||||
|
||||
def partial_received(self, request_receipt):
|
||||
for pid in self.page_partials:
|
||||
partial = self.page_partials[pid]
|
||||
if partial["request_id"] == request_receipt.request_id:
|
||||
try:
|
||||
partial["updated"] = partial["update_requested"]
|
||||
partial["request_id"] = None
|
||||
partial["content"] = request_receipt.response.decode("utf-8").rstrip()
|
||||
partial["attr_maps"] = markup_to_attrmaps(strip_modifiers(partial["content"]), url_delegate=self, fg_color=self.page_foreground_color, bg_color=self.page_background_color)
|
||||
pile = partial["pile"]
|
||||
pile.contents = [(e, pile.options()) for e in partial["attr_maps"]]
|
||||
|
||||
except Exception as e:
|
||||
RNS.trace_exception(e)
|
||||
|
||||
def __load_partial(self, partial):
|
||||
if partial["failed"] == True: return
|
||||
try: partial_destination_hash, path = self.parse_url(partial["url"])
|
||||
except Exception as e:
|
||||
RNS.log(f"Could not parse partial URL: {e}", RNS.LOG_ERROR)
|
||||
partial["failed"] = True
|
||||
pile = partial["pile"]
|
||||
url = partial["url"]
|
||||
pile.contents = [(urwid.Text(f"Could not load partial {url}: {e}"), pile.options())]
|
||||
return
|
||||
|
||||
if partial_destination_hash != self.loopback and not RNS.Transport.has_path(partial_destination_hash):
|
||||
if time.time() <= partial["pr_throttle"]: return
|
||||
else:
|
||||
partial["pr_throttle"] = time.time()+15
|
||||
RNS.log(f"Requesting path for partial: {partial_destination_hash} / {path}", RNS.LOG_EXTREME)
|
||||
RNS.Transport.request_path(partial_destination_hash)
|
||||
pr_time = time.time()+RNS.Transport.first_hop_timeout(partial_destination_hash)
|
||||
while not RNS.Transport.has_path(partial_destination_hash):
|
||||
now = time.time()
|
||||
if now > pr_time+self.timeout: return
|
||||
time.sleep(0.25)
|
||||
|
||||
for pid in self.page_partials:
|
||||
other_partial = self.page_partials[pid]
|
||||
if other_partial["link"]:
|
||||
existing_link = other_partial["link"]
|
||||
if existing_link.destination.hash == partial_destination_hash and existing_link.status == RNS.Link.ACTIVE:
|
||||
RNS.log(f"Re-using existing link: {existing_link}", RNS.LOG_EXTREME)
|
||||
partial["link"] = existing_link
|
||||
break
|
||||
|
||||
if not partial["link"] or partial["link"].status == RNS.Link.CLOSED:
|
||||
RNS.log(f"Establishing link for partial: {partial_destination_hash} / {path}", RNS.LOG_EXTREME)
|
||||
identity = RNS.Identity.recall(partial_destination_hash)
|
||||
destination = RNS.Destination(identity, RNS.Destination.OUT, RNS.Destination.SINGLE, self.app_name, self.aspects)
|
||||
|
||||
def established(link):
|
||||
RNS.log(f"Link established for partial: {partial_destination_hash} / {path}", RNS.LOG_EXTREME)
|
||||
|
||||
def closed(link):
|
||||
RNS.log(f"Link closed for partial: {partial_destination_hash} / {path}", RNS.LOG_EXTREME)
|
||||
partial["link"] = None
|
||||
|
||||
partial["link"] = RNS.Link(destination, established_callback = established, closed_callback = closed)
|
||||
timeout = time.time()+self.timeout
|
||||
while partial["link"].status != RNS.Link.ACTIVE and time.time() < timeout: time.sleep(0.1)
|
||||
|
||||
if partial["link"] and partial["link"].status == RNS.Link.ACTIVE and partial["request_id"] == None:
|
||||
RNS.log(f"Sending request for partial: {partial_destination_hash} / {path}", RNS.LOG_EXTREME)
|
||||
receipt = partial["link"].request(path, data=self.__get_partial_request_data(partial), response_callback = self.partial_received,
|
||||
failed_callback = self.partial_failed, progress_callback = self.partial_progressed)
|
||||
|
||||
if receipt: partial["request_id"] = receipt.request_id
|
||||
else: RNS.log(f"Partial request failed", RNS.LOG_ERROR)
|
||||
|
||||
def __get_partial_request_data(self, partial):
|
||||
request_data = None
|
||||
if partial["fields"] != None:
|
||||
link_data = partial["fields"]
|
||||
link_fields = []
|
||||
request_data = {}
|
||||
all_fields = True if "*" in link_data else False
|
||||
|
||||
for e in link_data:
|
||||
if "=" in e:
|
||||
c = e.split("=")
|
||||
if len(c) == 2:
|
||||
request_data["var_"+str(c[0])] = str(c[1])
|
||||
else:
|
||||
link_fields.append(e)
|
||||
|
||||
def recurse_down(w):
|
||||
if isinstance(w, list):
|
||||
for t in w:
|
||||
recurse_down(t)
|
||||
elif isinstance(w, tuple):
|
||||
for t in w:
|
||||
recurse_down(t)
|
||||
elif hasattr(w, "contents"):
|
||||
recurse_down(w.contents)
|
||||
elif hasattr(w, "original_widget"):
|
||||
recurse_down(w.original_widget)
|
||||
elif hasattr(w, "_original_widget"):
|
||||
recurse_down(w._original_widget)
|
||||
else:
|
||||
if hasattr(w, "field_name") and (all_fields or w.field_name in link_fields):
|
||||
field_key = "field_" + w.field_name
|
||||
if isinstance(w, urwid.Edit):
|
||||
request_data[field_key] = w.edit_text
|
||||
elif isinstance(w, urwid.RadioButton):
|
||||
if w.state:
|
||||
user_data = getattr(w, "field_value", None)
|
||||
if user_data is not None:
|
||||
request_data[field_key] = user_data
|
||||
elif isinstance(w, urwid.CheckBox):
|
||||
user_data = getattr(w, "field_value", "1")
|
||||
if w.state:
|
||||
existing_value = request_data.get(field_key, '')
|
||||
if existing_value:
|
||||
# Concatenate the new value with the existing one
|
||||
request_data[field_key] = existing_value + ',' + user_data
|
||||
else:
|
||||
# Initialize the field with the current value
|
||||
request_data[field_key] = user_data
|
||||
else:
|
||||
pass # do nothing if the checkbox is not checked
|
||||
|
||||
recurse_down(self.attr_maps)
|
||||
RNS.log("Including request data: "+str(request_data), RNS.LOG_DEBUG)
|
||||
|
||||
return request_data
|
||||
|
||||
def start_partial_updater(self):
|
||||
if not self.updater_running: self.update_partials()
|
||||
|
||||
def handle_partial_updates(self, partial_ids):
|
||||
RNS.log(f"Update partials: {partial_ids}")
|
||||
def job():
|
||||
for pid in self.page_partials:
|
||||
try:
|
||||
partial = self.page_partials[pid]
|
||||
if partial["id"] in partial_ids:
|
||||
partial["update_requested"] = time.time()
|
||||
self.__load_partial(partial)
|
||||
except Exception as e: RNS.log(f"Error updating page partial: {e}", RNS.LOG_ERROR)
|
||||
|
||||
threading.Thread(target=job, daemon=True).start()
|
||||
|
||||
def update_partials(self, loop=None, user_data=None):
|
||||
with self.partial_updater_lock:
|
||||
def job():
|
||||
for pid in self.page_partials:
|
||||
try:
|
||||
partial = self.page_partials[pid]
|
||||
if partial["failed"]: continue
|
||||
if not partial["updated"] or (partial["refresh"] != None and time.time() > partial["updated"]+partial["refresh"]):
|
||||
partial["update_requested"] = time.time()
|
||||
self.__load_partial(partial)
|
||||
except Exception as e: RNS.log(f"Error updating page partial: {e}", RNS.LOG_ERROR)
|
||||
|
||||
threading.Thread(target=job, daemon=True).start()
|
||||
|
||||
if len(self.page_partials) > 0:
|
||||
self.updater_running = True
|
||||
self.app.ui.loop.set_alarm_in(1, self.update_partials)
|
||||
else:
|
||||
self.updater_running = False
|
||||
|
||||
def identify(self):
|
||||
if self.link != None:
|
||||
@@ -486,35 +723,25 @@ class Browser:
|
||||
components = url.split(":")
|
||||
if len(components) == 1:
|
||||
if len(components[0]) == (RNS.Reticulum.TRUNCATED_HASHLENGTH//8)*2:
|
||||
try:
|
||||
destination_hash = bytes.fromhex(components[0])
|
||||
except Exception as e:
|
||||
raise ValueError("Malformed URL")
|
||||
try: destination_hash = bytes.fromhex(components[0])
|
||||
except Exception as e: raise ValueError("Malformed URL")
|
||||
path = Browser.DEFAULT_PATH
|
||||
else:
|
||||
raise ValueError("Malformed URL")
|
||||
else: raise ValueError("Malformed URL")
|
||||
elif len(components) == 2:
|
||||
if len(components[0]) == (RNS.Reticulum.TRUNCATED_HASHLENGTH//8)*2:
|
||||
try:
|
||||
destination_hash = bytes.fromhex(components[0])
|
||||
except Exception as e:
|
||||
raise ValueError("Malformed URL")
|
||||
try: destination_hash = bytes.fromhex(components[0])
|
||||
except Exception as e: raise ValueError("Malformed URL")
|
||||
path = components[1]
|
||||
if len(path) == 0:
|
||||
path = Browser.DEFAULT_PATH
|
||||
if len(path) == 0: path = Browser.DEFAULT_PATH
|
||||
else:
|
||||
if len(components[0]) == 0:
|
||||
if self.destination_hash != None:
|
||||
destination_hash = self.destination_hash
|
||||
path = components[1]
|
||||
if len(path) == 0:
|
||||
path = Browser.DEFAULT_PATH
|
||||
else:
|
||||
raise ValueError("Malformed URL")
|
||||
else:
|
||||
raise ValueError("Malformed URL")
|
||||
else:
|
||||
raise ValueError("Malformed URL")
|
||||
if len(path) == 0: path = Browser.DEFAULT_PATH
|
||||
else: raise ValueError("Malformed URL")
|
||||
else: raise ValueError("Malformed URL")
|
||||
else: raise ValueError("Malformed URL")
|
||||
|
||||
if destination_hash != None and path != None:
|
||||
if path.startswith("/file/"):
|
||||
@@ -697,7 +924,7 @@ class Browser:
|
||||
|
||||
def confirmed(sender):
|
||||
try:
|
||||
self.retrieve_url(e_url.get_edit_text())
|
||||
self.retrieve_url(e_url.get_edit_text().strip())
|
||||
except Exception as e:
|
||||
self.browser_footer = urwid.Text("Could not open link: "+str(e))
|
||||
self.frame.contents["footer"] = (self.browser_footer, self.frame.options())
|
||||
@@ -799,7 +1026,7 @@ class Browser:
|
||||
|
||||
self.page_background_color = None
|
||||
bgpos = self.markup.find("#!bg=")
|
||||
if bgpos:
|
||||
if bgpos >= 0:
|
||||
endpos = self.markup.find("\n", bgpos)
|
||||
if endpos-(bgpos+5) == 3:
|
||||
bg = self.markup[bgpos+5:endpos]
|
||||
@@ -807,13 +1034,13 @@ class Browser:
|
||||
|
||||
self.page_foreground_color = None
|
||||
fgpos = self.markup.find("#!fg=")
|
||||
if fgpos:
|
||||
if fgpos >= 0:
|
||||
endpos = self.markup.find("\n", fgpos)
|
||||
if endpos-(fgpos+5) == 3:
|
||||
fg = self.markup[fgpos+5:endpos]
|
||||
self.page_foreground_color = fg
|
||||
|
||||
self.attr_maps = markup_to_attrmaps(self.markup, url_delegate=self, fg_color=self.page_foreground_color, bg_color=self.page_background_color)
|
||||
self.attr_maps = markup_to_attrmaps(strip_modifiers(self.markup), url_delegate=self, fg_color=self.page_foreground_color, bg_color=self.page_background_color)
|
||||
|
||||
self.response_progress = 0
|
||||
self.response_speed = None
|
||||
@@ -866,7 +1093,7 @@ class Browser:
|
||||
|
||||
self.page_background_color = None
|
||||
bgpos = self.markup.find("#!bg=")
|
||||
if bgpos:
|
||||
if bgpos >= 0:
|
||||
endpos = self.markup.find("\n", bgpos)
|
||||
if endpos-(bgpos+5) == 3:
|
||||
bg = self.markup[bgpos+5:endpos]
|
||||
@@ -874,13 +1101,13 @@ class Browser:
|
||||
|
||||
self.page_foreground_color = None
|
||||
fgpos = self.markup.find("#!fg=")
|
||||
if fgpos:
|
||||
if fgpos >= 0:
|
||||
endpos = self.markup.find("\n", fgpos)
|
||||
if endpos-(fgpos+5) == 3:
|
||||
fg = self.markup[fgpos+5:endpos]
|
||||
self.page_foreground_color = fg
|
||||
|
||||
self.attr_maps = markup_to_attrmaps(self.markup, url_delegate=self, fg_color=self.page_foreground_color, bg_color=self.page_background_color)
|
||||
self.attr_maps = markup_to_attrmaps(strip_modifiers(self.markup), url_delegate=self, fg_color=self.page_foreground_color, bg_color=self.page_background_color)
|
||||
|
||||
self.response_progress = 0
|
||||
self.response_speed = None
|
||||
@@ -1018,7 +1245,7 @@ class Browser:
|
||||
|
||||
self.page_background_color = None
|
||||
bgpos = self.markup.find("#!bg=")
|
||||
if bgpos:
|
||||
if bgpos >= 0:
|
||||
endpos = self.markup.find("\n", bgpos)
|
||||
if endpos-(bgpos+5) == 3:
|
||||
bg = self.markup[bgpos+5:endpos]
|
||||
@@ -1026,13 +1253,13 @@ class Browser:
|
||||
|
||||
self.page_foreground_color = None
|
||||
fgpos = self.markup.find("#!fg=")
|
||||
if fgpos:
|
||||
if fgpos >= 0:
|
||||
endpos = self.markup.find("\n", fgpos)
|
||||
if endpos-(fgpos+5) == 3:
|
||||
fg = self.markup[fgpos+5:endpos]
|
||||
self.page_foreground_color = fg
|
||||
|
||||
self.attr_maps = markup_to_attrmaps(self.markup, url_delegate=self, fg_color=self.page_foreground_color, bg_color=self.page_background_color)
|
||||
self.attr_maps = markup_to_attrmaps(strip_modifiers(self.markup), url_delegate=self, fg_color=self.page_foreground_color, bg_color=self.page_background_color)
|
||||
self.response_progress = 0
|
||||
self.response_speed = None
|
||||
self.progress_updated_at = None
|
||||
|
||||
@@ -331,7 +331,7 @@ class ConversationsDisplay():
|
||||
existing_conversations = nomadnet.Conversation.conversation_list(self.app)
|
||||
|
||||
display_name = e_name.get_edit_text()
|
||||
source_hash_text = e_id.get_edit_text()
|
||||
source_hash_text = e_id.get_edit_text().strip()
|
||||
source_hash = bytes.fromhex(source_hash_text)
|
||||
trust_level = DirectoryEntry.UNTRUSTED
|
||||
if r_unknown.state == True:
|
||||
@@ -412,7 +412,7 @@ class ConversationsDisplay():
|
||||
try:
|
||||
local_delivery_signal = "local_delivery_occurred"
|
||||
duplicate_signal = "duplicate_lxm"
|
||||
lxm_uri = e_uri.get_edit_text()
|
||||
lxm_uri = e_uri.get_edit_text().strip()
|
||||
|
||||
ingest_result = self.app.message_router.ingest_lxm_uri(
|
||||
lxm_uri,
|
||||
|
||||
@@ -1455,6 +1455,36 @@ This line will
``


>Partials

You can include partials in pages, which will load asynchronously once the page itself has loaded.

`Faaa
`=
`{f64a846313b874ee4a357040807f8c77:/page/partial_1.mu}
`=
``

It's also possible to set an auto-refresh interval for partials. Omit or set to 0 to disable. The following partial will update every 10 seconds.

`Faaa
`=
`{f64a846313b874ee4a357040807f8c77:/page/refreshing_partial.mu`10}
`=
``

You can include field values and variables in partial updates, and by setting the `!pid`! variable, you can create links that update one or more specific partials.

`Faaa
`=
Name: `B444`<user_name`>`b

`F38a`[Say hello`p:32]`f

`{f64a846313b874e84a357039807f8c77:/page/hello_partial.mu`0`pid=32|user_name}
`=
``

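As a concrete but hypothetical counterpart to the example above: assuming partials are served like other dynamic pages, i.e. an executable script in the node's pages directory whose standard output becomes the served content, with submitted field values passed as environment variables (both assumptions here, not confirmed by this guide), hello_partial.mu could look roughly like this:

```python
#!/usr/bin/env python3
# Hypothetical dynamic partial backing the "Say hello" example above.
# Assumptions: the file sits in the node's pages directory, is marked
# executable, its stdout is returned as the partial's content, and the
# submitted user_name field arrives as the environment variable
# field_user_name. Plain text is valid micron, so no markup is required.
import os
import time

user_name = os.environ.get("field_user_name", "stranger")
print(f"Hello {user_name}, the node time is {time.strftime('%H:%M:%S')}")
```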
>Literals

To display literal content, for example source-code, or blocks of text that should not be interpreted by micron, you can use literal blocks, specified by the \\`= tag. Below is the source code of this entire document, presented as a literal block.

nomadnet/ui/textui/Helpers.py (new file, 0 lines)
@@ -2,6 +2,7 @@ import nomadnet
|
||||
import urwid
|
||||
import random
|
||||
import time
|
||||
import RNS
|
||||
from urwid.util import is_mouse_press
|
||||
from urwid.text_layout import calc_coords
|
||||
|
||||
@@ -85,6 +86,54 @@ def markup_to_attrmaps(markup, url_delegate = None, fg_color=None, bg_color=None
|
||||
|
||||
return attrmaps
|
||||
|
||||
def parse_partial(line):
|
||||
try:
|
||||
endpos = line.find("}")
|
||||
if endpos == -1: return None
|
||||
else:
|
||||
partial_data = line[0:endpos]
|
||||
|
||||
partial_id = None
|
||||
partial_components = partial_data.split("`")
|
||||
if len(partial_components) == 1:
|
||||
partial_url = partial_components[0]
|
||||
partial_refresh = None
|
||||
partial_fields = ""
|
||||
elif len(partial_components) == 2:
|
||||
partial_url = partial_components[0]
|
||||
partial_refresh = float(partial_components[1])
|
||||
partial_fields = ""
|
||||
elif len(partial_components) == 3:
|
||||
partial_url = partial_components[0]
|
||||
partial_refresh = float(partial_components[1])
|
||||
partial_fields = partial_components[2]
|
||||
else:
|
||||
partial_url = ""
|
||||
partial_fields = ""
|
||||
partial_refresh = None
|
||||
|
||||
if partial_refresh != None and partial_refresh < 1: partial_refresh = None
|
||||
|
||||
pf = partial_fields.split("|")
|
||||
if len(pf) > 0:
|
||||
partial_fields = pf
|
||||
for f in pf:
|
||||
if f.startswith("pid="):
|
||||
pcs = f.split("=")
|
||||
partial_id = pcs[1]
|
||||
|
||||
if len(partial_url):
|
||||
pile = urwid.Pile([urwid.Text(f"⧖")])
|
||||
partial_descriptor = "|".join(partial_components)
|
||||
pile.partial_id = partial_id
|
||||
pile.partial_hash = RNS.hexrep(RNS.Identity.full_hash(partial_descriptor.encode("utf-8")), delimit=False)
|
||||
pile.partial_url = partial_url
|
||||
pile.partial_fields = partial_fields
|
||||
pile.partial_refresh = partial_refresh
|
||||
return [pile]
|
||||
|
||||
except Exception as e: return None
|
||||
|
||||
def parse_line(line, state, url_delegate):
|
||||
pre_escape = False
|
||||
if len(line) > 0:
|
||||
@@ -106,6 +155,10 @@ def parse_line(line, state, url_delegate):
|
||||
elif first_char == "#":
|
||||
return None
|
||||
|
||||
# Check for partials
|
||||
elif line.startswith("`{"):
|
||||
return parse_partial(line[2:])
|
||||
|
||||
# Check for section heading reset
|
||||
elif first_char == "<":
|
||||
state["depth"] = 0
|
||||
@@ -283,14 +336,11 @@ def make_style(state):
|
||||
|
||||
if color[0] == "g":
|
||||
val = int(color[1:2])
|
||||
if val < 25:
|
||||
result = "black"
|
||||
elif val < 50:
|
||||
result = "dark gray"
|
||||
elif val < 75:
|
||||
result = "light gray"
|
||||
else:
|
||||
result = "white"
|
||||
if val < 25: result = "black"
|
||||
elif val < 50: result = "dark gray"
|
||||
elif val < 75: result = "light gray"
|
||||
else: result = "white"
|
||||
|
||||
else:
|
||||
r = int(color[0], 16)
|
||||
g = int(color[1], 16)
|
||||
@@ -298,65 +348,43 @@ def make_style(state):
|
||||
|
||||
if r == g == b:
|
||||
val = int(color[0], 16)*6
|
||||
if val < 12:
|
||||
result = "black"
|
||||
elif val < 50:
|
||||
result = "dark gray"
|
||||
elif val < 80:
|
||||
result = "light gray"
|
||||
else:
|
||||
result = "white"
|
||||
if val < 12: result = "black"
|
||||
elif val < 50: result = "dark gray"
|
||||
elif val < 80: result = "light gray"
|
||||
else: result = "white"
|
||||
|
||||
else:
|
||||
if r == b:
|
||||
if r > g:
|
||||
if r > t:
|
||||
result = "light magenta"
|
||||
else:
|
||||
result = "dark magenta"
|
||||
if r > t: result = "light magenta"
|
||||
else: result = "dark magenta"
|
||||
else:
|
||||
if g > t:
|
||||
result = "light green"
|
||||
else:
|
||||
result = "dark green"
|
||||
if g > t: result = "light green"
|
||||
else: result = "dark green"
|
||||
if b == g:
|
||||
if b > r:
|
||||
if b > t:
|
||||
result = "light cyan"
|
||||
else:
|
||||
result = "dark cyan"
|
||||
if b > t: result = "light cyan"
|
||||
else: result = "dark cyan"
|
||||
else:
|
||||
if r > t:
|
||||
result = "light red"
|
||||
else:
|
||||
result = "dark red"
|
||||
if r > t: result = "light red"
|
||||
else: result = "dark red"
|
||||
if g == r:
|
||||
if g > b:
|
||||
if g > t:
|
||||
result = "yellow"
|
||||
else:
|
||||
result = "brown"
|
||||
if g > t: result = "yellow"
|
||||
else: result = "brown"
|
||||
else:
|
||||
if b > t:
|
||||
result = "light blue"
|
||||
else:
|
||||
result = "dark blue"
|
||||
if b > t: result = "light blue"
|
||||
else: result = "dark blue"
|
||||
|
||||
if r > g and r > b:
|
||||
if r > t:
|
||||
result = "light red"
|
||||
else:
|
||||
result = "dark red"
|
||||
if r > t: result = "light red"
|
||||
else: result = "dark red"
|
||||
if g > r and g > b:
|
||||
if g > t:
|
||||
result = "light green"
|
||||
else:
|
||||
result = "dark green"
|
||||
if g > t: result = "light green"
|
||||
else: result = "dark green"
|
||||
if b > g and b > r:
|
||||
if b > t:
|
||||
result = "light blue"
|
||||
else:
|
||||
result = "dark blue"
|
||||
if b > t: result = "light blue"
|
||||
else: result = "dark blue"
|
||||
|
||||
except Exception as e:
|
||||
result = "default"
|
||||
@@ -422,12 +450,9 @@ def make_style(state):
|
||||
bg = state["bg_color"]
|
||||
|
||||
format_string = ""
|
||||
if bold:
|
||||
format_string += ",bold"
|
||||
if underline:
|
||||
format_string += ",underline"
|
||||
if italic:
|
||||
format_string += ",italics"
|
||||
if bold: format_string += ",bold"
|
||||
if underline: format_string += ",underline"
|
||||
if italic: format_string += ",italics"
|
||||
|
||||
name = "micron_"+fg+"_"+bg+"_"+format_string
|
||||
if not name in SYNTH_STYLES:
|
||||
@@ -487,20 +512,11 @@ def make_output(state, line, url_delegate, pre_escape=False):
|
||||
state["bg_color"] = state["default_bg"]
|
||||
state["align"] = state["default_align"]
|
||||
elif c == "c":
|
||||
if state["align"] != "center":
|
||||
state["align"] = "center"
|
||||
else:
|
||||
state["align"] = state["default_align"]
|
||||
if state["align"] != "center": state["align"] = "center"
|
||||
elif c == "l":
|
||||
if state["align"] != "left":
|
||||
state["align"] = "left"
|
||||
else:
|
||||
state["align"] = state["default_align"]
|
||||
if state["align"] != "left": state["align"] = "left"
|
||||
elif c == "r":
|
||||
if state["align"] != "right":
|
||||
state["align"] = "right"
|
||||
else:
|
||||
state["align"] = state["default_align"]
|
||||
if state["align"] != "right": state["align"] = "right"
|
||||
elif c == "a":
|
||||
state["align"] = state["default_align"]
|
||||
|
||||
@@ -649,7 +665,7 @@ def make_output(state, line, url_delegate, pre_escape=False):
|
||||
orig_spec = speclist[4]
|
||||
|
||||
if url_delegate != None:
|
||||
linkspec = LinkSpec(link_url, orig_spec)
|
||||
linkspec = LinkSpec(link_url, orig_spec, cm=cm)
|
||||
if link_fields != "":
|
||||
lf = link_fields.split("|")
|
||||
if len(lf) > 0:
|
||||
@@ -657,9 +673,7 @@ def make_output(state, line, url_delegate, pre_escape=False):
|
||||
|
||||
output.append((linkspec, link_label))
|
||||
else:
|
||||
output.append(make_part(state, link_label))
|
||||
|
||||
|
||||
output.append(make_part(state, link_label))
|
||||
|
||||
mode = "text"
|
||||
if len(part) > 0:
|
||||
@@ -696,11 +710,11 @@ def make_output(state, line, url_delegate, pre_escape=False):
|
||||
|
||||
|
||||
class LinkSpec(urwid.AttrSpec):
|
||||
def __init__(self, link_target, orig_spec):
|
||||
def __init__(self, link_target, orig_spec, cm=256):
|
||||
self.link_target = link_target
|
||||
self.link_fields = None
|
||||
|
||||
super().__init__(orig_spec.foreground, orig_spec.background)
|
||||
super().__init__(orig_spec.foreground, orig_spec.background, colors=cm)
|
||||
|
||||
|
||||
class LinkableText(urwid.Text):
|
||||
|
||||
@@ -6,6 +6,7 @@ import threading
|
||||
from datetime import datetime
|
||||
from nomadnet.Directory import DirectoryEntry
|
||||
from nomadnet.vendor.additional_urwid_widgets import IndicativeListBox, MODIFIER_KEY
|
||||
from nomadnet.util import strip_modifiers
|
||||
|
||||
from .Browser import Browser
|
||||
|
||||
@@ -84,7 +85,7 @@ class AnnounceInfo(urwid.WidgetWrap):
|
||||
type_string = "Peer " + g["peer"]
|
||||
|
||||
try:
|
||||
data_str = announce[2].decode("utf-8")
|
||||
data_str = strip_modifiers(announce[2].decode("utf-8"))
|
||||
data_style = ""
|
||||
if trust_level != DirectoryEntry.TRUSTED and len(data_str) > 32:
|
||||
data_str = data_str[:32]+" [...]"
|
||||
@@ -250,7 +251,7 @@ class AnnounceInfo(urwid.WidgetWrap):
|
||||
|
||||
|
||||
class AnnounceStreamEntry(urwid.WidgetWrap):
|
||||
def __init__(self, app, announce, delegate):
|
||||
def __init__(self, app, announce, delegate, show_destination=False):
|
||||
full_time_format = "%Y-%m-%d %H:%M:%S"
|
||||
date_time_format = "%Y-%m-%d"
|
||||
time_time_format = "%H:%M:%S"
|
||||
@@ -274,7 +275,16 @@ class AnnounceStreamEntry(urwid.WidgetWrap):
|
||||
ts_string = dt.strftime(date_only_format)
|
||||
|
||||
trust_level = self.app.directory.trust_level(source_hash)
|
||||
display_str = self.app.directory.simplest_display_str(source_hash)
|
||||
|
||||
if show_destination:
|
||||
display_str = RNS.hexrep(source_hash, delimit=False)
|
||||
else:
|
||||
try:
|
||||
display_str = strip_modifiers(announce[2].decode("utf-8"))
|
||||
if len(display_str) > 32:
|
||||
display_str = display_str[:32] + "..."
|
||||
except:
|
||||
display_str = self.app.directory.simplest_display_str(source_hash)
|
||||
|
||||
if trust_level == DirectoryEntry.UNTRUSTED:
|
||||
symbol = g["cross"]
|
||||
@@ -381,22 +391,33 @@ class AnnounceStream(urwid.WidgetWrap):
|
||||
self.ilb = None
|
||||
self.no_content = True
|
||||
self.current_tab = "nodes"
|
||||
self.show_destination = False
|
||||
self.search_text = ""
|
||||
|
||||
self.added_entries = []
|
||||
self.widget_list = []
|
||||
self.update_widget_list()
|
||||
|
||||
# Create tab buttons
|
||||
self.tab_nodes = TabButton("Nodes", on_press=self.show_nodes_tab)
|
||||
self.tab_peers = TabButton("Peers", on_press=self.show_peers_tab)
|
||||
self.tab_pn = TabButton("Propagation Nodes", on_press=self.show_pn_tab)
|
||||
self.tab_nodes = TabButton("Nodes (0)", on_press=self.show_nodes_tab)
|
||||
self.tab_peers = TabButton("Peers (0)", on_press=self.show_peers_tab)
|
||||
self.tab_pn = TabButton("Propagation Nodes (0)", on_press=self.show_pn_tab)
|
||||
|
||||
# Create tab bar with proportional widths
|
||||
self.tab_bar = urwid.Columns([
|
||||
('weight', 1, self.tab_nodes),
|
||||
('weight', 1, self.tab_peers),
|
||||
('weight', 3, self.tab_pn),
|
||||
], dividechars=1) # Add 1 character spacing between tabs
|
||||
], dividechars=1)
|
||||
|
||||
self.search_edit = urwid.Edit(caption="Search: ")
|
||||
urwid.connect_signal(self.search_edit, 'change', self.on_search_change)
|
||||
|
||||
self.display_toggle = TabButton("Show: Name", on_press=self.toggle_display_mode)
|
||||
|
||||
self.filter_bar = urwid.Columns([
|
||||
('weight', 2, self.search_edit),
|
||||
('weight', 1, self.display_toggle),
|
||||
], dividechars=1)
|
||||
|
||||
self.update_widget_list()
|
||||
|
||||
self.ilb = ExceptionHandlingListBox(
|
||||
self.widget_list,
|
||||
@@ -406,9 +427,9 @@ class AnnounceStream(urwid.WidgetWrap):
|
||||
#highlight_offFocus="list_off_focus"
|
||||
)
|
||||
|
||||
# Combine tab bar and list box
|
||||
self.pile = urwid.Pile([
|
||||
('pack', self.tab_bar),
|
||||
('pack', self.filter_bar),
|
||||
('weight', 1, self.ilb),
|
||||
])
|
||||
|
||||
@@ -416,13 +437,25 @@ class AnnounceStream(urwid.WidgetWrap):
|
||||
super().__init__(urwid.LineBox(self.display_widget, title="Announce Stream"))
|
||||
|
||||
def keypress(self, size, key):
|
||||
if key == "up" and (self.no_content or self.ilb.first_item_is_selected()):
|
||||
if key == "up" and self.pile.focus == self.tab_bar:
|
||||
nomadnet.NomadNetworkApp.get_shared_instance().ui.main_display.frame.focus_position = "header"
|
||||
elif key == "ctrl x":
|
||||
self.delete_selected_entry()
|
||||
|
||||
return super(AnnounceStream, self).keypress(size, key)
|
||||
|
||||
def on_search_change(self, widget, text):
|
||||
self.search_text = text.lower()
|
||||
self.update_widget_list()
|
||||
|
||||
def toggle_display_mode(self, button):
|
||||
self.show_destination = not self.show_destination
|
||||
if self.show_destination:
|
||||
self.display_toggle.set_label("Show: Dest")
|
||||
else:
|
||||
self.display_toggle.set_label("Show: Name")
|
||||
self.update_widget_list()
|
||||
|
||||
def delete_selected_entry(self):
|
||||
if self.ilb.get_selected_item() != None:
|
||||
self.app.directory.remove_announce_with_timestamp(self.ilb.get_selected_item().original_widget.timestamp)
|
||||
@@ -438,19 +471,36 @@ class AnnounceStream(urwid.WidgetWrap):
|
||||
self.widget_list = []
|
||||
new_entries = []
|
||||
|
||||
node_count = 0
|
||||
peer_count = 0
|
||||
pn_count = 0
|
||||
|
||||
for e in self.app.directory.announce_stream:
|
||||
announce_type = e[3]
|
||||
|
||||
# Filter based on current tab
|
||||
if self.current_tab == "nodes" and (announce_type == "node" or announce_type == True):
|
||||
new_entries.append(e)
|
||||
elif self.current_tab == "peers" and (announce_type == "peer" or announce_type == False):
|
||||
new_entries.append(e)
|
||||
elif self.current_tab == "pn" and announce_type == "pn":
|
||||
new_entries.append(e)
|
||||
if self.search_text:
|
||||
try:
|
||||
announce_data = e[2].decode("utf-8").lower()
|
||||
except:
|
||||
announce_data = ""
|
||||
if self.search_text not in announce_data:
|
||||
continue
|
||||
|
||||
if announce_type == "node" or announce_type == True:
|
||||
node_count += 1
|
||||
if self.current_tab == "nodes":
|
||||
new_entries.append(e)
|
||||
elif announce_type == "peer" or announce_type == False:
|
||||
peer_count += 1
|
||||
if self.current_tab == "peers":
|
||||
new_entries.append(e)
|
||||
elif announce_type == "pn":
|
||||
pn_count += 1
|
||||
if self.current_tab == "pn":
|
||||
new_entries.append(e)
|
||||
|
||||
for e in new_entries:
|
||||
nw = AnnounceStreamEntry(self.app, e, self)
|
||||
nw = AnnounceStreamEntry(self.app, e, self, show_destination=self.show_destination)
|
||||
nw.timestamp = e[0]
|
||||
self.widget_list.append(nw)
|
||||
|
||||
@@ -460,6 +510,10 @@ class AnnounceStream(urwid.WidgetWrap):
|
||||
self.no_content = True
|
||||
self.widget_list = [urwid.Text(f"No {self.current_tab} announces", align='center')]
|
||||
|
||||
self.tab_nodes.set_label(f"Nodes ({node_count})")
|
||||
self.tab_peers.set_label(f"Peers ({peer_count})")
|
||||
self.tab_pn.set_label(f"Propagation Nodes ({pn_count})")
|
||||
|
||||
if self.ilb:
|
||||
self.ilb.set_body(self.widget_list)
|
||||
|
||||
@@ -555,7 +609,7 @@ class KnownNodeInfo(urwid.WidgetWrap):
|
||||
if node_entry == None:
|
||||
display_str = self.app.directory.simplest_display_str(source_hash)
|
||||
else:
|
||||
display_str = node_entry.display_name
|
||||
display_str = strip_modifiers(node_entry.display_name)
|
||||
|
||||
addr_str = "<"+RNS.hexrep(source_hash, delimit=False)+">"
|
||||
|
||||
@@ -1648,7 +1702,6 @@ class NetworkDisplay():
|
||||
def reinit_known_nodes(self):
|
||||
self.known_nodes_display = KnownNodes(self.app)
|
||||
self.known_nodes_display.delegate = self
|
||||
self.close_list_dialogs()
|
||||
self.announce_stream_display.rebuild_widget_list()
|
||||
|
||||
def reinit_lxmf_peers(self):
|
||||
@@ -1843,15 +1896,24 @@ class LXMFPeerEntry(urwid.WidgetWrap):
|
||||
style = "list_unresponsive"
|
||||
focus_style = "list_focus_unresponsive"
|
||||
|
||||
if peer.propagation_transfer_limit:
|
||||
txfer_limit = RNS.prettysize(peer.propagation_transfer_limit*1000)
|
||||
else:
|
||||
txfer_limit = "No"
|
||||
if peer.propagation_transfer_limit: txfer_limit = RNS.prettysize(peer.propagation_transfer_limit*1000)
|
||||
else: txfer_limit = "No"
|
||||
|
||||
if peer.propagation_sync_limit: sync_limit = RNS.prettysize(peer.propagation_sync_limit*1000)
|
||||
else: sync_limit = "Unknown"
|
||||
|
||||
if peer.propagation_stamp_cost: sct = peer.propagation_stamp_cost
|
||||
else: sct = "Unknown"
|
||||
|
||||
if peer.propagation_stamp_cost_flexibility: scf = f" (flex {peer.propagation_stamp_cost_flexibility})"
|
||||
else: scf = ""
|
||||
|
||||
ar = round(peer.acceptance_rate*100, 2)
|
||||
peer_info_str = sym+" "+display_str+"\n "+alive_string+", last heard "+pretty_date(int(peer.last_heard))
|
||||
peer_info_str += "\n "+str(peer.unhandled_message_count)+f" unhandled LXMs, {txfer_limit} sync limit\n"
|
||||
peer_info_str += f" {RNS.prettyspeed(peer.sync_transfer_rate)} STR, "
|
||||
peer_info_str += f"{RNS.prettyspeed(peer.link_establishment_rate)} LER, {ar}% AR\n"
|
||||
peer_info_str += f"\n {sync_limit} sync limit, {txfer_limit} msg limit"
|
||||
peer_info_str += f"\n {RNS.prettyspeed(peer.sync_transfer_rate)} STR, {RNS.prettyspeed(peer.link_establishment_rate)} LER"
|
||||
peer_info_str += f"\n Propagation cost {sct}{scf}"
|
||||
peer_info_str += "\n "+str(peer.unhandled_message_count)+f" unhandled LXMs, {ar}% AR"
|
||||
widget = ListEntry(peer_info_str)
|
||||
self.display_widget = urwid.AttrMap(widget, style, focus_style)
|
||||
self.display_widget.destination_hash = destination_hash
|
||||
@@ -1899,4 +1961,4 @@ def pretty_date(time=False):
|
||||
return str(int(day_diff / 7)) + " weeks ago"
|
||||
if day_diff < 365:
|
||||
return str(int(day_diff / 30)) + " months ago"
|
||||
return str(int(day_diff / 365)) + " years ago"
|
||||
return str(int(day_diff / 365)) + " years ago"
|
||||
@@ -1,5 +1,7 @@
|
||||
import os
|
||||
import glob
|
||||
|
||||
modules = glob.glob(os.path.dirname(__file__)+"/*.py")
|
||||
__all__ = [ os.path.basename(f)[:-3] for f in modules if not f.endswith('__init__.py')]
|
||||
py_modules = glob.glob(os.path.dirname(__file__)+"/*.py")
|
||||
pyc_modules = glob.glob(os.path.dirname(__file__)+"/*.pyc")
|
||||
modules = py_modules+pyc_modules
|
||||
__all__ = list(set([os.path.basename(f).replace(".pyc", "").replace(".py", "") for f in modules if not (f.endswith("__init__.py") or f.endswith("__init__.pyc"))]))
|
||||
|
||||
nomadnet/util.py (new file, 35 lines)
@@ -0,0 +1,35 @@
|
||||
import re
|
||||
import unicodedata
|
||||
|
||||
invalid_rendering = ["🕵️", "☝"]
|
||||
|
||||
def strip_modifiers(text):
|
||||
def process_characters(text):
|
||||
result = []
|
||||
i = 0
|
||||
while i < len(text):
|
||||
char = text[i]
|
||||
category = unicodedata.category(char)
|
||||
|
||||
if category.startswith(('L', 'N', 'P', 'S')):
|
||||
result.append(char)
|
||||
i += 1
|
||||
elif category.startswith(('M', 'Sk', 'Cf')) or char in '\u200d\u200c':
|
||||
i += 1
|
||||
else:
|
||||
result.append(char)
|
||||
i += 1
|
||||
|
||||
return ''.join(result)
|
||||
|
||||
for char in invalid_rendering:
|
||||
text = text.replace(char, " ")
|
||||
|
||||
stripped = process_characters(text)
|
||||
stripped = re.sub(r'[\uFE00-\uFE0F]', '', stripped)
|
||||
stripped = re.sub(r'[\U000E0100-\U000E01EF]', '', stripped, flags=re.UNICODE)
|
||||
stripped = re.sub(r'[\U0001F3FB-\U0001F3FF]', '', stripped, flags=re.UNICODE)
|
||||
stripped = re.sub(r'[\u200D\u200C]', '', stripped)
|
||||
stripped = re.sub(r'\r\n?', '\n', stripped)
|
||||
|
||||
return stripped
|
||||
nomadnet/vendor/__init__.py (vendored, 6 lines changed)
@@ -1,5 +1,7 @@
|
||||
import os
|
||||
import glob
|
||||
|
||||
modules = glob.glob(os.path.dirname(__file__)+"/*.py")
|
||||
__all__ = [ os.path.basename(f)[:-3] for f in modules if not f.endswith('__init__.py')]
|
||||
py_modules = glob.glob(os.path.dirname(__file__)+"/*.py")
|
||||
pyc_modules = glob.glob(os.path.dirname(__file__)+"/*.pyc")
|
||||
modules = py_modules+pyc_modules
|
||||
__all__ = list(set([os.path.basename(f).replace(".pyc", "").replace(".py", "") for f in modules if not (f.endswith("__init__.py") or f.endswith("__init__.pyc"))]))
|
||||
|
||||
setup.py (2 lines changed)
@@ -30,6 +30,6 @@ setuptools.setup(
|
||||
entry_points= {
|
||||
'console_scripts': ['nomadnet=nomadnet.nomadnet:main']
|
||||
},
|
||||
install_requires=["rns>=1.0.0", "lxmf>=0.8.0", "urwid>=2.6.16", "qrcode"],
|
||||
install_requires=["rns>=1.0.4", "lxmf>=0.9.3", "urwid>=2.6.16", "qrcode"],
|
||||
python_requires=">=3.7",
|
||||
)