powrelay.xyz

3 thousand hashes per byte
so what is the real issue with spam? you say it's that it makes the global feed unreadable, and that the solution is to use WoT? nope, that's not the real issue. the real issue with spam is that it makes relays impossible to operate due to the massive increase in required bandwidth and storage. this is why client-side-only filters do not save nostr.
Created at:
Thu Mar 13 19:10:46 UTC 2025
Kind:
1 Text note
Tags:
nonce 122539 18
3 thousand hashes per byte
im often forced to code react because everyone else uses it. i've never seen anything as ridiculous as incrementing a value like it's done in react:

const [inc, setInc] = useState(0);

useEffect(() => {
  const ti = setInterval(() => {
    setInc(prevState => prevState + 1);
  }, 3000);
  return () => {
    clearInterval(ti);
  };
}, []);

useEffect(() => {
  console.log(inc);
}, [inc]);
Created at:
Wed Mar 12 04:26:11 UTC 2025
Kind:
1 Text note
Tags:
nonce 432873 18
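For contrast, a minimal plain-JavaScript sketch of the same behaviour (not from the note; just an illustration of what the hooks boilerplate replaces) could read:

```javascript
// Plain-JS equivalent of the React snippet above: one mutable
// variable, one interval, no hooks or dependency arrays.
let inc = 0;
function tick() {
  inc += 1;         // increment the counter
  console.log(inc); // log the new value (React's second useEffect)
}
const ti = setInterval(tick, 3000);
// clearInterval(ti) stops it, mirroring the useEffect cleanup.
```

React's version exists because component state must survive re-renders; outside a render loop, the direct form above is all that's needed.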
3 thousand hashes per byte
https://x0.at/8Lm9.png
Created at:
Sun Mar 30 00:25:29 UTC 2025
Kind:
1 Text note
Tags:
client getwired.app
nonce 696844 20
2 thousand hashes per byte
#taxstr 💱📈⛏️ https://image.nostr.build/476f3dfbb7f988f68eff1333f20cc888415f00a912417238168a17f7ea3acd86.png
Created at:
Mon Mar 10 22:50:07 UTC 2025
Kind:
1 Text note
Tags:
t taxstr
nonce 553351 18
2 thousand hashes per byte
querying randomx pow notes:

nostril-query -g w 4,8,12 | websocat wss://nostr.data.haus | jq -c

the actual proof is available from the r-tag.
Created at:
Thu Mar 13 22:07:38 UTC 2025
Kind:
1 Text note
Tags:
nonce 410622 18
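Every note on this page carries a "nonce &lt;value&gt; &lt;target&gt;" tag in the NIP-13 style, where the target is a count of leading zero bits in the event id. A minimal sketch of how such a target can be checked (a hypothetical helper, separate from the relay's RandomX r-tag proof mentioned above):

```javascript
// Sketch: count leading zero bits of a hex event id, NIP-13 style.
// A note meets difficulty d when leadingZeroBits(id) >= d.
function leadingZeroBits(hexId) {
  let count = 0;
  for (const ch of hexId) {
    const nibble = parseInt(ch, 16);
    if (nibble === 0) { count += 4; continue; } // whole nibble is zero
    count += Math.clz32(nibble) - 28;           // leading zeros inside 4 bits
    break;
  }
  return count;
}
```

For example, the e-tag above starting with "00001c…" has 19 leading zero bits: four zero nibbles (16 bits) plus three zero bits of the "1" nibble.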
2 thousand hashes per byte
so you say we should implement "trust score" systems on relays, like nostr.band already does. well, go on, but this is in reality just censorship and it will destroy nostr.
Created at:
Thu Mar 13 19:12:05 UTC 2025
Kind:
1 Text note
Tags:
nonce 427382 18
2 thousand hashes per byte
true anarchists transmit their photos in bmp however.
Created at:
Sun Mar 23 01:23:20 UTC 2025
Kind:
1 Text note
Tags:
nonce 1117319 18
p 32254c79c89036eba9a13d9d8a599ca71b0b4db4963fae5f46beeca2f25310a4
e 00001c333670cdce5a4e5a80fff638cb6eb2cfda3bc7dafbbc9df43bc393611e
nonce 527163 18
2 thousand hashes per byte
OK, let's try this: https://23gmt.nostr.technology/ A relay that accepts kind:1 root notes only between 23GMT and 24GMT (also known as: right now) every day, and deletes everything one hour after closing. Also, only "protected" notes are accepted so they don't leak.
Created at:
Fri Mar 14 23:16:27 UTC 2025
Kind:
1 Text note
Tags:
nonce 13835058055282185013 16
1 thousand hashes per byte
here is an implementation of a nostr data stream:

r="wss://nostr.data.haus"

function send(){
  key=`nostril --kind 1 2>&1 | grep -Pom1 "\S{64}"`
  eid=""
  messages=("this" "is" "a" "simple" "example" "of" "recursive" "data" "stream")
  for i in {8..0}
  do
    event=`nostril --envelope --sec $key $([ ${#eid} -gt 0 ] && echo "-e $eid") --kind 3434 --content "${messages[$i]}"`
    eid=`echo "$event" | jq -r .[1].id`
    echo "$event" | websocat -n1 "$r"
  done
  echo "event id for streaming: $eid"
}

function download(){
  [ ${#1} -ne 64 ] && {
    echo "event id required"
    return
  }
  eid=$1
  while true
  do
    event=`nostril-query -i $eid | websocat -n1 $r`
    echo "received packet: "`echo "$event" | jq -r .[2].content`
    eid=`echo "$event" | jq -r '.[2].tags[]|select(.[0]=="e")[1]'`
    [ ${#eid} -eq 0 ] && break
  done
}

[ "$1" == "send" ] && { send; exit; }
[ "$1" == "dl" ] && { download "$2"; exit; }
echo "Usage: "
echo "./recursion.sh send"
echo "./recursion.sh dl <id>"

output:

./recursion.sh send
["OK","71804acbd113cbc92db3dfa61ae4837d6b5a7d1f50367f4a01193145d25eeba3",true,""]
["OK","f0f69313a91eefd2f667131004944dcb9b15fa72ae1ff1d5f15ffd947ce92ebb",true,""]
["OK","0bcbc9033a1010a102af8d1a2c002429458ba1737aed70abeb191c562209f3db",true,""]
["OK","3b1f5e3f52a4850c5e5eb08f359614614ad95a9d9cf47747883d9358f0b16eb4",true,""]
["OK","03f5a1aab60ff2aeaac8ed71a78bb4032ff5c893422bb43b5c94da3a21213a8c",true,""]
["OK","5c5200cba9c0deed8a2a2b3ca7f2845cf6aee98b41dc1a0f9c76371be87badfd",true,""]
["OK","aa04b39e54915d74c1439c098f9b9c30122faf1a0947fafe12e7df799096bdfb",true,""]
["OK","ccb2564ae69cd640b2269f840d7868a004db2b1eac42db634f1434d15b199108",true,""]
["OK","aad5b4135a42858cba6fdd31118d0f5e5781f2aba571ce90a429482d656a47d7",true,""]
event id for streaming: aad5b4135a42858cba6fdd31118d0f5e5781f2aba571ce90a429482d656a47d7

./recursion.sh dl aad5b4135a42858cba6fdd31118d0f5e5781f2aba571ce90a429482d656a47d7
received packet: this
received packet: is
received packet: a
received packet: simple
received packet: example
received packet: of
received packet: recursive
received packet: data
received packet: stream

in a real scenario, each word can be replaced with up to 50 kB of base64 data.
Created at:
Sun Mar 9 02:12:59 UTC 2025
Kind:
1 Text note
Tags:
nonce 30367 18
1 thousand hashes per byte
i think bitwise search could be pretty fast. how? tokenize the data. each token is given a bit position. if a token exists in a row, its bit is 1, otherwise it's 0. now find relevant rows for the given input tokens. for example, a search could be: token1, token455, token664. create a bitmask from the input tokens and apply it to the rows' bitmasks to find relevant results.
Created at:
Sat Mar 22 12:14:57 UTC 2025
Kind:
1 Text note
Tags:
nonce 284326 18
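The token-bitmask idea in the note above can be sketched like this (the vocabulary, token names, and rows are made up for illustration; BigInt is used so the mask is not capped at 32 tokens):

```javascript
// Each known token gets a bit position; a row's signature has bit i
// set iff token i occurs in that row. A row matches a query when all
// query bits are present: (rowSig & querySig) === querySig.
function signature(tokens, vocab) {
  let sig = 0n;
  for (const t of tokens) {
    if (vocab.has(t)) sig |= 1n << BigInt(vocab.get(t)); // unknown tokens ignored
  }
  return sig;
}

// Hypothetical vocabulary and data rows.
const vocab = new Map([["token1", 0], ["token455", 1], ["token664", 2], ["other", 3]]);
const rows = [
  ["token1", "token455", "token664"], // contains all query tokens
  ["token1", "other"],                // missing token455, token664
];

const query = signature(["token1", "token455", "token664"], vocab);
const hits = rows.filter(r => (signature(r, vocab) & query) === query);
```

The per-row check is a single AND plus compare, which is why this can be fast; the real cost moves to building and storing the per-row signatures.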