Update#
After being reminded by a reader's comment, I found this article and its source code repository, which turn a Telegram channel into a web microblog.
https://chi.miantiao.me/posts/broadcast-channel/
You can configure it by following the instructions in the GitHub repository, but I found that only Cloudflare Pages deploys it normally; deploying to Vercel ran into a runtime environment error. The project offers many environment variables, but for a minimal setup you only need to set the channel username.
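For Cloudflare Pages that means adding a single environment variable in the project settings. As a rough sketch (the variable name below is from memory of the repository's README, so double-check it before deploying):

```
# Minimal configuration for the Cloudflare Pages deployment.
# The variable name CHANNEL is an assumption — verify it in the repo README.
CHANNEL=your_channel_username
```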
I had previously used AI to create a simple webpage that fetched Telegram channel content via RSS and displayed it item by item, but this project is more mature, easier to deploy, and has a nicer interface.
Below is my original tinkering process, which is no longer very useful; if you just want to display a Telegram channel as a web microblog, deploying the project above is enough.
👇 Original Text Below#
I am currently using Planet for my blog; it can manage multiple sites, so I created another one for daily notes. The only drawback is that the Planet iOS app can only publish by connecting to the computer's API, so even though I can post from my phone, the computer still has to be on.
macOS has a feature called "Wake for network access," which lets the computer use the network while asleep, but in practice I found the connection only works intermittently, whereas keeping the computer awake gives a stable connection. I use Cloudflare Tunnel for external access, but it's still a bit of a hassle.
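For reference, a named Cloudflare Tunnel setup looks roughly like this; the tunnel name, hostname, and local port below are placeholders, and the port Planet's API actually listens on may differ:

```
# One-time setup: authenticate and create a named tunnel
cloudflared tunnel login
cloudflared tunnel create planet-api

# Point a DNS record in your Cloudflare zone at the tunnel
# (the hostname is a placeholder)
cloudflared tunnel route dns planet-api planet.example.com

# ~/.cloudflared/config.yml — forward the hostname to the local Planet API
# (localhost:8086 is a placeholder port)
#   tunnel: planet-api
#   credentials-file: ~/.cloudflared/<tunnel-id>.json
#   ingress:
#     - hostname: planet.example.com
#       service: http://localhost:8086
#     - service: http_status:404

# Run the tunnel
cloudflared tunnel run planet-api
```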
A few days ago I came across a project that publishes Telegram channels as web pages, but when I actually created my channel yesterday I couldn't find it again; I forgot where I had seen it.

I also found that there is an xSync service that works with xLog and can sync content from a few platforms to the Crossbell chain. I tried it with Jike, but the verification step required changing my profile signature, which Jike doesn't allow. So I bound my Telegram channel instead, which requires the channel to be public. After binding, however, nothing would sync, neither manually nor automatically, and I don't know why.
So I asked ChatGPT for suggestions, and it said I could do this with RSSHub + rss2json. I had it write an HTML page, which worked but wasn't very practical: first, the official RSSHub instance caches for 1 hour, so new content only shows up an hour later; second, rss2json had a cross-origin request problem. In the end I had the AI revise it a few times, dropping rss2json and reading the RSS content directly, and to shorten the cache time I set up a personal RSSHub instance with a 300-second cache. That worked perfectly. The final code is here: https://github.com/urkbio/rss2web
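The core of such a page is just fetching the RSS feed and rendering each item into the DOM. A minimal sketch, assuming a self-hosted RSSHub instance that allows cross-origin requests (the instance URL and channel name are placeholders, and the actual code in the rss2web repository differs):

```html
<!-- Minimal sketch: fetch a Telegram channel's RSS from a self-hosted RSSHub
     instance and render each item. URL and channel name are placeholders;
     the RSSHub instance must allow cross-origin requests for this to work. -->
<div id="feed"></div>
<script>
  const FEED_URL = "https://rsshub.example.com/telegram/channel/your_channel";

  async function loadFeed() {
    const res = await fetch(FEED_URL);
    const xml = new DOMParser().parseFromString(await res.text(), "application/xml");

    const feed = document.getElementById("feed");
    for (const item of xml.querySelectorAll("item")) {
      const post = document.createElement("article");
      const description = item.querySelector("description")?.textContent ?? "";
      const date = item.querySelector("pubDate")?.textContent ?? "";
      post.innerHTML = `<div>${description}</div><small>${date}</small>`;
      feed.appendChild(post);
    }
  }

  loadFeed();
</script>
```

As for the cache, if I remember RSSHub's documentation correctly, the route cache duration is controlled by an environment variable (CACHE_EXPIRE, in seconds), which is how a self-hosted instance can be set to 300 seconds.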
I initially wanted to host it directly on GitHub Pages, but then found that Planet has a feature for publishing a folder, so I tried that and it worked. This way the source stays on my local computer and is easy to modify.
I also remembered 4everland, so I added the generated IPNS to 4everland for deployment and bound a domain name, which is now accessible at https://chnl.zyg.im. Since the page fetches data from the RSS source on every visit, the HTML itself never changes, so once deployed it just keeps working and isn't affected by 4everland's update delays.
Since I used IPFS, it can also be bound to a web3 domain. I added a subdomain to my .sol domain and put the ipns:// content into its record, so it can be accessed via the public gateway sol.build:
https://chnl.joomaen.sol.build
The public gateway cannot resolve subdomains by default; I had previously asked about this in a comment on V2EX, and Livid helped me add it manually.