forked from midou/invidious
Compare commits
38 Commits
57d88ffcc8
e46e6183ae
b49623f90f
95c6747a3e
245d0b571f
6e0df50a03
f88697541c
5eefab62fd
13b0526c7a
1568a35cfb
93082c0a45
1a39faee75
81b447782a
c87aa8671c
921c34aa65
ccc423f682
02335f3390
bcc8ba73bf
35e63fa3f5
3fe4547f8e
2dbe151ceb
e2c15468e0
022427e20e
88430a6fc0
c72b9bea64
80bc29f3cd
f7125c1204
6f9056fd84
3733fe8272
98bb20abcd
a4d44d3286
dc358fc7e5
e14f2f2750
650b44ade2
3830604e42
f83e9e6eb9
236358d3ad
43d6b65b4f
CHANGELOG.md (120 changed lines)

@@ -1,10 +1,40 @@
# 0.9.0 (2018-10-08)

## Week 9: Playlists

Not as much to announce this week, but I'm still quite happy to announce a couple things, namely:

Playback support for playlists has finally been added with [`88430a6`](https://github.com/omarroth/invidious/88430a6). You can now view playlists with the `&list=` query param, as you would on YouTube. You can also view mixes with the mentioned `&list=`, although they require some extra handling that I would like to add in the coming week, as well as adding playlist looping and shuffle. I think playback support has been a roadblock for more exciting features such as [#114](https://github.com/omarroth/invidious/issues/114), and I look forward to improving the experience.

Comments have had a bit of a cosmetic upgrade with [#132](https://github.com/omarroth/invidious/issues/132), which I think helps better distinguish between Reddit and YouTube comments, as it makes them appear similarly to their respective sites. You can also now switch between YouTube and Reddit comments with a push of a button, which I think is quite an improvement, especially for newer or less popular videos with fewer comments.

I've had a small breakthrough in speeding up users' subscription feeds with PostgreSQL's [materialized views](https://www.postgresql.org/docs/current/static/rules-materializedviews.html). Without going into too much detail, materialized views essentially cache the result of a query, making it possible to run resource-intensive queries once, rather than every time a user visits their feed. In the coming week I hope to push this out to users, and hopefully close [#173](https://github.com/omarroth/invidious/issues/173).
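For reference, a condensed sketch of how this is wired up elsewhere in this diff (see `get_user` and `refresh_feeds` below); it assumes the `PG_DB` connection, the `sha256` helper, and the `user` object from the surrounding code:

```crystal
# Each user's subscription feed is cached in a per-user materialized view whose
# name is derived from a hash of the user's email.
view_name = "subscriptions_#{sha256(user.email)[0..7]}"

# Run the expensive feed query once and store its result...
PG_DB.exec("CREATE MATERIALIZED VIEW #{view_name} AS \
  SELECT * FROM channel_videos WHERE \
  ucid = ANY ((SELECT subscriptions FROM users WHERE email = '#{user.email}')::text[]) \
  ORDER BY published DESC;")

# ...so reading the feed becomes a cheap SELECT from the view, and a background
# job only has to refresh the cached rows periodically.
PG_DB.exec("REFRESH MATERIALIZED VIEW #{view_name}")
```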
I haven't had as much time to work on the project this week, but I'm quite happy to have added some new features. Have a great week everyone.

# 0.8.0 (2018-10-02)

## Week 8: Mixes

Hello again!

Mixes have been added with [`20130db`](https://github.com/omarroth/invidious/20130db), which makes it easy to create a playlist of related content. See [#188](https://github.com/omarroth/invidious/issues/188) for more info on how they work. Currently, they return the first 50 videos rather than a continuous feed to avoid tracking by Google/YouTube, which I think is a good trade-off between usability and privacy, and I hope other folks agree. You can create mixes by adding `RD` to the beginning of a video ID; an example is provided [here](https://www.invidio.us/mix?list=RDYE7VzlLtp-4) based on Big Buck Bunny. I've been quite happy with the results returned for the mixes I've tried, and it is not limited to music, which I think is a big plus. To emulate the continuous feed that many are used to, using the last video of each mix as a new 'seed' has worked well for me. In the coming week I'd like to add playback support in the player to listen to these easily.
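As a rough illustration of the seed-chaining trick described above (a sketch, not code from the repository): it uses the `/api/v1/mixes/:rdid` endpoint added later in this diff, and the instance host and iteration count are assumptions.

```crystal
require "http/client"
require "json"
require "uri"

client = HTTP::Client.new(URI.parse("https://www.invidio.us")) # assumed instance

# "RD" prepended to a video ID names a mix seeded from that video.
seed = "YE7VzlLtp-4" # Big Buck Bunny, as in the example above
3.times do
  mix = JSON.parse(client.get("/api/v1/mixes/RD#{seed}").body)
  mix["videos"].as_a.each do |video|
    puts "#{video["videoId"]}  #{video["title"]}"
  end

  # Emulate a continuous feed by using the last returned video as the next seed.
  seed = mix["videos"].as_a.last["videoId"].as_s
end
```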
A very big thanks to [**@flourgaz**](https://github.com/flourgaz) for Docker support with [#186](https://github.com/omarroth/invidious/pull/186). This is an enormous improvement in portability for the project, and opens the door for Heroku support (see [#162](https://github.com/omarroth/invidious/issues/162)) and seamless support on Windows. For most users, it should be as easy as running `docker-compose up`.

I've spent quite a bit of time this past week improving support for geo-bypass (see [#92](https://github.com/omarroth/invidious/issues/92)), and am happy to note that Invidious has been able to proxy ~50% of the geo-restricted videos I've tried. In addition, you can now watch geo-restricted videos if you have `dash` enabled as your `preferred quality`; for more details see [#34](https://github.com/omarroth/invidious/issues/34) and [#185](https://github.com/omarroth/invidious/issues/185), or last week's update. For folks interested in replicating these results for themselves, I'd take a look [here](https://gist.github.com/omarroth/3ce0f276c43e0c4b13e7d9cd35524688) for the script used, and [here](https://gist.github.com/omarroth/beffc4a76a7b82a422e1b36a571878ef) for a list of videos restricted in the US.

1080p has seen a fairly smooth roll-out, although there have been a couple of issues reported, mainly [#193](https://github.com/omarroth/invidious/issues/193), which is likely an issue in the player. I've also encountered a couple of other issues myself that I would like to investigate. Although none are major, I'd like to keep 1080p opt-in for registered users for another week to better address these issues.

Have an excellent week everyone.

# 0.7.0 (2018-09-25)

## Week 7: 1080p and Search Types

Hello again everyone! I've got quite a few announcements this week:

Experimental 1080p support has been added with [`b3ca392`](https://github.com/omarroth/invidious/b3ca392), and can be enabled by going to preferences and changing `preferred video quality` to `dash`. You can find more details [here](https://github.com/omarroth/invidious/issues/34#issuecomment-424171888). Currently, quality and speed controls have not yet been integrated into the player, but I'd still appreciate feedback, mainly on any issues with buffering or DASH playback. I hope to integrate 1080p support into the player and push support site-wide in the coming weeks.

You can now filter content types in search with the `type:TYPE` filter. Supported content types are `playlist`, `channel`, and `video`. More info is available [here](https://github.com/omarroth/invidious/issues/126#issuecomment-423823148). I think this is quite an improvement in usability and I hope others find the same.
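A quick usage sketch (hypothetical, not from the repository): the filter is just part of the search string, so a request for channels matching a query looks like the following; the instance host and the `q` parameter name are assumptions, while `type:channel`, `type:playlist`, and `type:video` are the values listed above.

```crystal
require "http/client"
require "uri"

# Ask the search page for channels only by appending the type filter to the query.
query = URI.escape("big buck bunny type:channel")
response = HTTP::Client.get("https://www.invidio.us/search?q=#{query}")
puts response.status_code # 200 if the instance answered the filtered search
```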
@@ -13,16 +43,16 @@ A [CHANGELOG](https://github.com/omarroth/invidious/blob/master/CHANGELOG.md) ha

Recently, users have been reporting 504s when attempting to access their subscriptions, which is tracked in [#173](https://github.com/omarroth/invidious/issues/173). This is most likely caused by an uptick in usage, which I am absolutely grateful for, but unfortunately has resulted in an increase in costs for hosting the site, which is why I will be bumping my goal on Patreon from $60 to $80. I would appreciate any feedback on how subscriptions could be improved.

Other minor improvements include:

- Additional regions added to bypass geo-block with [`9a78523`](https://github.com/omarroth/invidious/9a78523)
- Fix for playlists containing less than 100 videos (previously shown as empty) with [`35ac887`](https://github.com/omarroth/invidious/35ac887)
- Fix for `published` date for Reddit comments (previously showing negative seconds) with [`6e09202`](https://github.com/omarroth/invidious/6e09202)

Thank you everyone for your support!

# 0.6.0 (2018-09-18)

## Week 6: Filters and Thumbnails
Hello again! This week I'm happy to mention a couple of new features for search, as well as some miscellaneous usability improvements.

You can now constrain your search query to a specific channel with the `channel:CHANNEL` filter (see [#165](https://github.com/omarroth/invidious/issues/165) for more details). Unfortunately, other search filters combined with channel search are not yet supported. I hope to add support for them in the coming weeks.

@@ -35,12 +65,12 @@ As a smaller improvement to the site, you can also now view RSS feeds for playli

These updates are also now listed under GitHub's [releases](https://github.com/omarroth/invidious/releases). I'm also planning on adding them as a `CHANGELOG.md` in the repository itself so people can receive a copy with the project's source.

That's all for this week. Thank you everyone for your support!

# 0.5.0 (2018-09-11)

## Week 5: Privacy and Security
I hope everyone had a good weekend! This past week I've been fixing some issues that have been brought to my attention, to help better protect users and help them keep their anonymity.

An issue with open referers has been fixed with [`29a2186`](https://github.com/omarroth/invidious/29a2186), which prevents potential redirects to external sites on actions such as logging in or modifying preferences.

@@ -67,12 +97,12 @@ Folks have also probably noticed that the gutters on either side of the screen h

"Music", "Sports", and "Popular on YouTube" channels now properly display their videos. You can subscribe to these channels just as you would normally.

This coming week I'm planning on spending time with my family, so I unfortunately may not be as responsive. I do still hope to add some smaller features for next week, however, and I hope to continue development soon.

Thank you everyone again for your support.

# 0.4.0 (2018-09-06)

## Week 4: Genre Channels

Hello! I hope everyone enjoyed their weekend. Without further ado:

Just today genre channels have been added with [#119](https://github.com/omarroth/invidious/issues/119). More information on genre channels is available [here](https://support.google.com/youtube/answer/2579942). You can subscribe to them as you would normally, and view them as RSS. I think they offer an interesting alternative way to find new content and I hope people find them useful.
@@ -84,12 +114,12 @@ One of the major use cases for Invidious is as a stripped-down version of YouTub

Finally, I'm pleased to announce that Invidious has hit 100 stars on GitHub. I am very happy that Invidious has proven to be useful to so many people, and I can't say how grateful I am to everyone for their continued support.

Enjoy the rest of your week everyone!

# 0.3.0 (2018-09-06)

## Week 3: Quality of Life

Hello everyone! This week I've been working on some smaller features that will hopefully make the site more functional.

Search filters have been added with [#126](https://github.com/omarroth/invidious/issues/126). You can now specify 'sort', 'date', 'duration', and 'features' within your query using the 'operator:value' syntax. I'd recommend taking a look [here](https://github.com/omarroth/invidious/blob/master/src/invidious/search.cr#L33-L114) for a list of supported options and at [#126](https://github.com/omarroth/invidious/issues/126) for some examples. This also opens the door for features such as [#30](https://github.com/omarroth/invidious/issues/30), which can be implemented as filters. I think advanced search is a major point on which Invidious can improve on YouTube, and I hope to add more features soon!
@@ -101,12 +131,12 @@ I'd also like to announce that I've set up an account on [Liberapay](https://lib

[Two weeks ago](https://github.com/omarroth/invidious/releases/tag/0.1.0) I mentioned adding 1080p support to the player. Currently, the only thing blocking it is [#207](https://github.com/videojs/http-streaming/pull/207) in the excellent [http-streaming](https://github.com/videojs/http-streaming) library. I hope to work with the videojs team to merge it soon and finally implement 1080p support!

That's all for this week, thank you again everyone for your support!

# 0.2.0 (2018-09-06)

## Week 2: Toward Playlists

Sorry for the late update! Not as much to announce this week, but still a couple of things of note:

I'm happy to announce that a playlists page and API endpoint have been added, so you can now view playlists. Currently, you cannot watch playlists through the player, but I hope to add that in the coming week, as well as functionality to add and modify playlists. There is a good conversation on [#114](https://github.com/omarroth/invidious/issues/114) about giving playlists even more functionality, which I think is interesting and would appreciate feedback on.
@@ -120,12 +150,12 @@ A couple of miscellaneous features and bugfixes:

- Changed YouTube comment header to "View x comments" - [#120](https://github.com/omarroth/invidious/issues/120)

Enjoy your week everyone!

# 0.1.0 (2018-09-06)

## Week 1: Invidious API and Geo-Bypass

Hello everyone! This past week there have been quite a few things worthy of mention:

I'm happy to announce the [Invidious Developer API](https://github.com/omarroth/invidious/wiki/API). The Invidious API does not use any of the official YouTube APIs; instead it crawls the site to provide a JSON interface for other developers to use. It's still under development, but it is already powering [CloudTube](https://github.com/cloudrac3r/cadencegq). The API currently does not have a quota (compared to YouTube), which I hope to continue thanks to continued support from my Patrons. Hopefully other developers find it useful, and I hope to continue to improve it so it can better serve the community.
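For example, a minimal consumer of the API might look like the sketch below; the `/api/v1/videos/:id` route and the fields shown appear in the code later in this diff, while the instance host is an assumption.

```crystal
require "http/client"
require "json"

# Fetch the JSON description of a single video (Big Buck Bunny is used as the example ID).
video = JSON.parse(HTTP::Client.get("https://www.invidio.us/api/v1/videos/YE7VzlLtp-4").body)

puts video["lengthSeconds"]                      # length of the video, in seconds
puts video["authorId"]                           # channel UCID
puts video["authorThumbnails"].as_a.first["url"] # smallest author thumbnail
```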
@@ -134,4 +164,4 @@ Just today partial support for bypassing geo-restrictions has been added with [f

Support for generating DASH manifests has been fixed. In the coming week I hope to integrate this functionality into the watch page, so users can view videos in 1080p and above.
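A sketch of how the manifest can be fetched (hypothetical usage; the `/api/manifest/dash/id/:id` route appears later in this diff, and the instance host is an assumption). A DASH-capable player can also be pointed at this URL directly.

```crystal
require "http/client"

manifest = HTTP::Client.get("https://www.invidio.us/api/manifest/dash/id/YE7VzlLtp-4").body
puts manifest.lines.first # an MPD (XML) document describing the available streams
```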
Thank you everyone for your continued interest and support!
@@ -17,6 +17,15 @@ div {

```diff
     animation: spin 2s linear infinite;
 }
 
+.playlist-restricted {
+    height: 20em;
+    padding-right: 10px;
+}
+
+.pure-button-primary {
+    background: rgba(0, 182, 240, 1);
+}
+
 /*
  * Navbar
  */
```
assets/js/watch.js (new file, 59 lines)
@@ -0,0 +1,59 @@

```javascript
function toggle_parent(target) {
    body = target.parentNode.parentNode.children[1];
    if (body.style.display === null || body.style.display === "") {
        target.innerHTML = "[ + ]";
        body.style.display = "none";
    } else {
        target.innerHTML = "[ - ]";
        body.style.display = "";
    }
}

function toggle_comments(target) {
    body = target.parentNode.parentNode.parentNode.children[1];
    if (body.style.display === null || body.style.display === "") {
        target.innerHTML = "[ + ]";
        body.style.display = "none";
    } else {
        target.innerHTML = "[ - ]";
        body.style.display = "";
    }
}

function swap_comments(source) {
    if (source == "youtube") {
        get_youtube_comments();
    } else if (source == "reddit") {
        get_reddit_comments();
    }
}

function commaSeparateNumber(val) {
    while (/(\d+)(\d{3})/.test(val.toString())) {
        val = val.toString().replace(/(\d+)(\d{3})/, "$1" + "," + "$2");
    }
    return val;
}

String.prototype.supplant = function(o) {
    return this.replace(/{([^{}]*)}/g, function(a, b) {
        var r = o[b];
        return typeof r === "string" || typeof r === "number" ? r : a;
    });
};

function show_youtube_replies(target) {
    body = target.parentNode.parentNode.children[1];
    body.style.display = "";

    target.innerHTML = "Hide replies";
    target.setAttribute("onclick", "hide_youtube_replies(this)");
}

function hide_youtube_replies(target) {
    body = target.parentNode.parentNode.children[1];
    body.style.display = "none";

    target.innerHTML = "Show replies";
    target.setAttribute("onclick", "show_youtube_replies(this)");
}
```
@@ -1,5 +1,6 @@

```diff
 crawl_threads: 1
 channel_threads: 1
+feed_threads: 1
 video_threads: 1
 db:
   user: kemal
```

@@ -8,4 +9,5 @@ db:

```diff
   port: 5432
   dbname: invidious
 full_refresh: false
 https_only: false
+geo_bypass: true
```
@@ -22,6 +22,8 @@ CREATE TABLE public.videos

```diff
     genre text COLLATE pg_catalog."default",
     genre_url text COLLATE pg_catalog."default",
     license text COLLATE pg_catalog."default",
+    sub_count_text text COLLATE pg_catalog."default",
+    author_thumbnail text COLLATE pg_catalog."default",
     CONSTRAINT videos_pkey PRIMARY KEY (id)
 )
 WITH (
```
@@ -1,5 +1,5 @@

```diff
 name: invidious
-version: 0.7.0
+version: 0.9.0
 
 authors:
   - Omar Roth <omarroth@hotmail.com>
```
src/invidious.cr (330 changed lines)
@ -31,6 +31,7 @@ HMAC_KEY = CONFIG.hmac_key || Random::Secure.random_bytes(32)
|
||||
|
||||
crawl_threads = CONFIG.crawl_threads
|
||||
channel_threads = CONFIG.channel_threads
|
||||
feed_threads = CONFIG.feed_threads
|
||||
video_threads = CONFIG.video_threads
|
||||
|
||||
Kemal.config.extra_options do |parser|
|
||||
@ -51,6 +52,14 @@ Kemal.config.extra_options do |parser|
|
||||
exit
|
||||
end
|
||||
end
|
||||
parser.on("-f THREADS", "--feed-threads=THREADS", "Number of threads for refreshing feeds (default: #{feed_threads})") do |number|
|
||||
begin
|
||||
feed_threads = number.to_i
|
||||
rescue ex
|
||||
puts "THREADS must be integer"
|
||||
exit
|
||||
end
|
||||
end
|
||||
parser.on("-v THREADS", "--video-threads=THREADS", "Number of threads for refreshing videos (default: #{video_threads})") do |number|
|
||||
begin
|
||||
video_threads = number.to_i
|
||||
@ -85,6 +94,8 @@ end
|
||||
|
||||
refresh_channels(PG_DB, channel_threads, CONFIG.full_refresh)
|
||||
|
||||
refresh_feeds(PG_DB, feed_threads)
|
||||
|
||||
video_threads.times do |i|
|
||||
spawn do
|
||||
refresh_videos(PG_DB)
|
||||
@ -106,10 +117,12 @@ spawn do
|
||||
end
|
||||
|
||||
proxies = {} of String => Array({ip: String, port: Int32})
|
||||
spawn do
|
||||
find_working_proxies(BYPASS_REGIONS) do |region, list|
|
||||
if !list.empty?
|
||||
proxies[region] = list
|
||||
if CONFIG.geo_bypass
|
||||
spawn do
|
||||
find_working_proxies(BYPASS_REGIONS) do |region, list|
|
||||
if !list.empty?
|
||||
proxies[region] = list
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|
||||
@ -215,6 +228,8 @@ get "/watch" do |env|
|
||||
next env.redirect "/"
|
||||
end
|
||||
|
||||
plid = env.params.query["list"]?
|
||||
|
||||
user = env.get? "user"
|
||||
if user
|
||||
user = user.as(User)
|
||||
@ -235,6 +250,8 @@ get "/watch" do |env|
|
||||
|
||||
begin
|
||||
video = get_video(id, PG_DB, proxies)
|
||||
rescue ex : VideoRedirect
|
||||
next env.redirect "/watch?v=#{ex.message}"
|
||||
rescue ex
|
||||
error_message = ex.message
|
||||
STDOUT << id << " : " << ex.message << "\n"
|
||||
@ -335,6 +352,8 @@ get "/embed/:id" do |env|
|
||||
|
||||
begin
|
||||
video = get_video(id, PG_DB, proxies)
|
||||
rescue ex : VideoRedirect
|
||||
next env.redirect "/embed/#{ex.message}"
|
||||
rescue ex
|
||||
error_message = ex.message
|
||||
next templated "error"
|
||||
@ -400,6 +419,10 @@ get "/playlist" do |env|
|
||||
page = env.params.query["page"]?.try &.to_i?
|
||||
page ||= 1
|
||||
|
||||
if plid.starts_with? "RD"
|
||||
next env.redirect "/mix?list=#{plid}"
|
||||
end
|
||||
|
||||
begin
|
||||
playlist = fetch_playlist(plid)
|
||||
rescue ex
|
||||
@ -463,9 +486,8 @@ get "/search" do |env|
|
||||
user = env.get? "user"
|
||||
if user
|
||||
user = user.as(User)
|
||||
ucids = user.subscriptions
|
||||
view_name = "subscriptions_#{sha256(user.email)[0..7]}"
|
||||
end
|
||||
ucids ||= [] of String
|
||||
|
||||
channel = nil
|
||||
content_type = "all"
|
||||
@ -502,14 +524,19 @@ get "/search" do |env|
|
||||
if channel
|
||||
count, videos = channel_search(search_query, page, channel)
|
||||
elsif subscriptions
|
||||
videos = PG_DB.query_all("SELECT id,title,published,updated,ucid,author FROM (
|
||||
if view_name
|
||||
videos = PG_DB.query_all("SELECT id,title,published,updated,ucid,author FROM (
|
||||
SELECT *,
|
||||
to_tsvector(channel_videos.title) ||
|
||||
to_tsvector(channel_videos.author)
|
||||
to_tsvector(#{view_name}.title) ||
|
||||
to_tsvector(#{view_name}.author)
|
||||
as document
|
||||
FROM channel_videos WHERE ucid IN (#{arg_array(ucids, 3)})
|
||||
) v_search WHERE v_search.document @@ plainto_tsquery($1) LIMIT 20 OFFSET $2;", [search_query, (page - 1) * 20] + ucids, as: ChannelVideo)
|
||||
count = videos.size
|
||||
FROM #{view_name}
|
||||
) v_search WHERE v_search.document @@ plainto_tsquery($1) LIMIT 20 OFFSET $2;", search_query, (page - 1) * 20, as: ChannelVideo)
|
||||
count = videos.size
|
||||
else
|
||||
videos = [] of ChannelVideo
|
||||
count = 0
|
||||
end
|
||||
else
|
||||
begin
|
||||
search_params = produce_search_params(sort: sort, date: date, content_type: content_type,
|
||||
@ -743,7 +770,7 @@ post "/login" do |env|
|
||||
end
|
||||
|
||||
if action == "signin"
|
||||
user = PG_DB.query_one?("SELECT * FROM users WHERE email = $1 AND password IS NOT NULL", email, as: User)
|
||||
user = PG_DB.query_one?("SELECT * FROM users WHERE LOWER(email) = LOWER($1) AND password IS NOT NULL", email, as: User)
|
||||
|
||||
if !user
|
||||
error_message = "Invalid username or password"
|
||||
@ -757,7 +784,7 @@ post "/login" do |env|
|
||||
|
||||
if Crypto::Bcrypt::Password.new(user.password.not_nil!) == password
|
||||
sid = Base64.urlsafe_encode(Random::Secure.random_bytes(32))
|
||||
PG_DB.exec("UPDATE users SET id = id || $1 WHERE email = $2", [sid], email)
|
||||
PG_DB.exec("UPDATE users SET id = id || $1 WHERE LOWER(email) = LOWER($2)", [sid], email)
|
||||
|
||||
if Kemal.config.ssl || CONFIG.https_only
|
||||
secure = true
|
||||
@ -772,7 +799,7 @@ post "/login" do |env|
|
||||
next templated "error"
|
||||
end
|
||||
elsif action == "register"
|
||||
user = PG_DB.query_one?("SELECT * FROM users WHERE email = $1 AND password IS NOT NULL", email, as: User)
|
||||
user = PG_DB.query_one?("SELECT * FROM users WHERE LOWER(email) = LOWER($1) AND password IS NOT NULL", email, as: User)
|
||||
if user
|
||||
error_message = "Please sign in"
|
||||
next templated "error"
|
||||
@ -787,6 +814,12 @@ post "/login" do |env|
|
||||
|
||||
PG_DB.exec("INSERT INTO users VALUES (#{args})", user_array)
|
||||
|
||||
view_name = "subscriptions_#{sha256(user.email)[0..7]}"
|
||||
PG_DB.exec("CREATE MATERIALIZED VIEW #{view_name} AS \
|
||||
SELECT * FROM channel_videos WHERE \
|
||||
ucid = ANY ((SELECT subscriptions FROM users WHERE email = '#{user.email}')::text[]) \
|
||||
ORDER BY published DESC;")
|
||||
|
||||
if Kemal.config.ssl || CONFIG.https_only
|
||||
secure = true
|
||||
else
|
||||
@ -1113,12 +1146,14 @@ post "/data_control" do |env|
|
||||
body = JSON.parse(body)
|
||||
body["subscriptions"].as_a.each do |ucid|
|
||||
ucid = ucid.as_s
|
||||
if !user.subscriptions.includes? ucid
|
||||
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE id = $2", ucid, user.id)
|
||||
|
||||
if !user.subscriptions.includes? ucid
|
||||
begin
|
||||
client = make_client(YT_URL)
|
||||
get_channel(ucid, client, PG_DB, false, false)
|
||||
|
||||
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
|
||||
user.subscriptions << ucid
|
||||
rescue ex
|
||||
next
|
||||
end
|
||||
@ -1127,8 +1162,10 @@ post "/data_control" do |env|
|
||||
|
||||
body["watch_history"].as_a.each do |id|
|
||||
id = id.as_s
|
||||
|
||||
if !user.watched.includes? id
|
||||
PG_DB.exec("UPDATE users SET watched = array_append(watched,$1) WHERE email = $2", id, user.email)
|
||||
user.watched << id
|
||||
end
|
||||
end
|
||||
|
||||
@ -1139,11 +1176,12 @@ post "/data_control" do |env|
|
||||
ucid = channel["xmlUrl"].match(/UC[a-zA-Z0-9_-]{22}/).not_nil![0]
|
||||
|
||||
if !user.subscriptions.includes? ucid
|
||||
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
|
||||
|
||||
begin
|
||||
client = make_client(YT_URL)
|
||||
get_channel(ucid, client, PG_DB, false, false)
|
||||
|
||||
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
|
||||
user.subscriptions << ucid
|
||||
rescue ex
|
||||
next
|
||||
end
|
||||
@ -1154,11 +1192,12 @@ post "/data_control" do |env|
|
||||
ucid = md["channel_id"]
|
||||
|
||||
if !user.subscriptions.includes? ucid
|
||||
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
|
||||
|
||||
begin
|
||||
client = make_client(YT_URL)
|
||||
get_channel(ucid, client, PG_DB, false, false)
|
||||
|
||||
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
|
||||
user.subscriptions << ucid
|
||||
rescue ex
|
||||
next
|
||||
end
|
||||
@ -1170,11 +1209,12 @@ post "/data_control" do |env|
|
||||
ucid = channel["url"].as_s.match(/UC[a-zA-Z0-9_-]{22}/).not_nil![0]
|
||||
|
||||
if !user.subscriptions.includes? ucid
|
||||
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
|
||||
|
||||
begin
|
||||
client = make_client(YT_URL)
|
||||
get_channel(ucid, client, PG_DB, false, false)
|
||||
|
||||
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
|
||||
user.subscriptions << ucid
|
||||
rescue ex
|
||||
next
|
||||
end
|
||||
@ -1190,19 +1230,24 @@ post "/data_control" do |env|
|
||||
|
||||
db = entry.io.gets_to_end
|
||||
db.scan(/youtube\.com\/watch\?v\=(?<id>[a-zA-Z0-9_-]{11})/) do |md|
|
||||
if !user.watched.includes? md["id"]
|
||||
PG_DB.exec("UPDATE users SET watched = array_append(watched,$1) WHERE email = $2", md["id"], user.email)
|
||||
id = md["id"]
|
||||
|
||||
if !user.watched.includes? id
|
||||
PG_DB.exec("UPDATE users SET watched = array_append(watched,$1) WHERE email = $2", id, user.email)
|
||||
user.watched << id
|
||||
end
|
||||
end
|
||||
|
||||
db.scan(/youtube\.com\/channel\/(?<ucid>[a-zA-Z0-9_-]{22})/) do |md|
|
||||
ucid = md["ucid"]
|
||||
if !user.subscriptions.includes? ucid
|
||||
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
|
||||
|
||||
if !user.subscriptions.includes? ucid
|
||||
begin
|
||||
client = make_client(YT_URL)
|
||||
get_channel(ucid, client, PG_DB, false, false)
|
||||
|
||||
PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions,$1) WHERE email = $2", ucid, user.email)
|
||||
user.subscriptions << ucid
|
||||
rescue ex
|
||||
next
|
||||
end
|
||||
@ -1340,6 +1385,8 @@ get "/feed/subscriptions" do |env|
|
||||
|
||||
notifications = PG_DB.query_one("SELECT notifications FROM users WHERE email = $1", user.email,
|
||||
as: Array(String))
|
||||
view_name = "subscriptions_#{sha256(user.email)[0..7]}"
|
||||
|
||||
if preferences.notifications_only && !notifications.empty?
|
||||
args = arg_array(notifications)
|
||||
|
||||
@ -1362,39 +1409,35 @@ get "/feed/subscriptions" do |env|
|
||||
else
|
||||
if preferences.latest_only
|
||||
if preferences.unseen_only
|
||||
ucids = arg_array(user.subscriptions)
|
||||
if user.watched.empty?
|
||||
watched = "'{}'"
|
||||
else
|
||||
watched = arg_array(user.watched, user.subscriptions.size + 1)
|
||||
watched = arg_array(user.watched)
|
||||
end
|
||||
|
||||
videos = PG_DB.query_all("SELECT DISTINCT ON (ucid) * FROM channel_videos WHERE \
|
||||
ucid IN (#{ucids}) AND id NOT IN (#{watched}) ORDER BY ucid, published DESC",
|
||||
user.subscriptions + user.watched, as: ChannelVideo)
|
||||
videos = PG_DB.query_all("SELECT DISTINCT ON (ucid) * FROM #{view_name} WHERE \
|
||||
id NOT IN (#{watched}) ORDER BY ucid, published DESC",
|
||||
user.watched, as: ChannelVideo)
|
||||
else
|
||||
args = arg_array(user.subscriptions)
|
||||
videos = PG_DB.query_all("SELECT DISTINCT ON (ucid) * FROM channel_videos WHERE \
|
||||
ucid IN (#{args}) ORDER BY ucid, published DESC", user.subscriptions, as: ChannelVideo)
|
||||
videos = PG_DB.query_all("SELECT DISTINCT ON (ucid) * FROM #{view_name} \
|
||||
ORDER BY ucid, published DESC", as: ChannelVideo)
|
||||
end
|
||||
|
||||
videos.sort_by! { |video| video.published }.reverse!
|
||||
else
|
||||
if preferences.unseen_only
|
||||
ucids = arg_array(user.subscriptions, 3)
|
||||
if user.watched.empty?
|
||||
watched = "'{}'"
|
||||
else
|
||||
watched = arg_array(user.watched, user.subscriptions.size + 3)
|
||||
watched = arg_array(user.watched, 3)
|
||||
end
|
||||
|
||||
videos = PG_DB.query_all("SELECT * FROM channel_videos WHERE ucid IN (#{ucids}) \
|
||||
AND id NOT IN (#{watched}) ORDER BY published DESC LIMIT $1 OFFSET $2",
|
||||
[limit, offset] + user.subscriptions + user.watched, as: ChannelVideo)
|
||||
videos = PG_DB.query_all("SELECT * FROM #{view_name} WHERE \
|
||||
id NOT IN (#{watched}) LIMIT $1 OFFSET $2",
|
||||
[limit, offset] + user.watched, as: ChannelVideo)
|
||||
else
|
||||
args = arg_array(user.subscriptions, 3)
|
||||
videos = PG_DB.query_all("SELECT * FROM channel_videos WHERE ucid IN (#{args}) \
|
||||
ORDER BY published DESC LIMIT $1 OFFSET $2", [limit, offset] + user.subscriptions, as: ChannelVideo)
|
||||
videos = PG_DB.query_all("SELECT * FROM #{view_name} \
|
||||
ORDER BY published DESC LIMIT $1 OFFSET $2", limit, offset, as: ChannelVideo)
|
||||
end
|
||||
end
|
||||
|
||||
@ -1443,29 +1486,8 @@ get "/feed/channel/:ucid" do |env|
|
||||
halt env, status_code: 404, response: error_message
|
||||
end
|
||||
|
||||
client = make_client(YT_URL)
|
||||
|
||||
page = 1
|
||||
|
||||
videos = [] of SearchVideo
|
||||
2.times do |i|
|
||||
url = produce_channel_videos_url(ucid, page * 2 + (i - 1), auto_generated: auto_generated)
|
||||
response = client.get(url)
|
||||
json = JSON.parse(response.body)
|
||||
|
||||
if json["content_html"]? && !json["content_html"].as_s.empty?
|
||||
document = XML.parse_html(json["content_html"].as_s)
|
||||
nodeset = document.xpath_nodes(%q(//li[contains(@class, "feed-item-container")]))
|
||||
|
||||
if auto_generated
|
||||
videos += extract_videos(nodeset)
|
||||
else
|
||||
videos += extract_videos(nodeset, ucid)
|
||||
end
|
||||
else
|
||||
break
|
||||
end
|
||||
end
|
||||
videos, count = get_60_videos(ucid, page, auto_generated)
|
||||
|
||||
host_url = make_host_url(Kemal.config.ssl || CONFIG.https_only, env.request.headers["Host"]?)
|
||||
path = env.request.path
|
||||
@ -1552,15 +1574,14 @@ get "/feed/private" do |env|
|
||||
latest_only ||= 0
|
||||
latest_only = latest_only == 1
|
||||
|
||||
view_name = "subscriptions_#{sha256(user.email)[0..7]}"
|
||||
|
||||
if latest_only
|
||||
args = arg_array(user.subscriptions)
|
||||
videos = PG_DB.query_all("SELECT DISTINCT ON (ucid) * FROM channel_videos WHERE \
|
||||
ucid IN (#{args}) ORDER BY ucid, published DESC", user.subscriptions, as: ChannelVideo)
|
||||
videos = PG_DB.query_all("SELECT DISTINCT ON (ucid) * FROM #{view_name} ORDER BY ucid, published DESC", as: ChannelVideo)
|
||||
videos.sort_by! { |video| video.published }.reverse!
|
||||
else
|
||||
args = arg_array(user.subscriptions, 3)
|
||||
videos = PG_DB.query_all("SELECT * FROM channel_videos WHERE ucid IN (#{args}) \
|
||||
ORDER BY published DESC LIMIT $1 OFFSET $2", [limit, offset] + user.subscriptions, as: ChannelVideo)
|
||||
videos = PG_DB.query_all("SELECT * FROM #{view_name} \
|
||||
ORDER BY published DESC LIMIT $1 OFFSET $2", limit, offset, as: ChannelVideo)
|
||||
end
|
||||
|
||||
sort = env.params.query["sort"]?
|
||||
@ -1697,7 +1718,7 @@ get "/channel/:ucid" do |env|
|
||||
page ||= 1
|
||||
|
||||
begin
|
||||
author, ucid, auto_generated = get_about_info(ucid)
|
||||
author, ucid, auto_generated, sub_count = get_about_info(ucid)
|
||||
rescue ex
|
||||
error_message = "User does not exist"
|
||||
next templated "error"
|
||||
@ -1711,27 +1732,7 @@ get "/channel/:ucid" do |env|
|
||||
end
|
||||
end
|
||||
|
||||
client = make_client(YT_URL)
|
||||
|
||||
videos = [] of SearchVideo
|
||||
2.times do |i|
|
||||
url = produce_channel_videos_url(ucid, page * 2 + (i - 1), auto_generated: auto_generated)
|
||||
response = client.get(url)
|
||||
json = JSON.parse(response.body)
|
||||
|
||||
if json["content_html"]? && !json["content_html"].as_s.empty?
|
||||
document = XML.parse_html(json["content_html"].as_s)
|
||||
nodeset = document.xpath_nodes(%q(//li[contains(@class, "feed-item-container")]))
|
||||
|
||||
if auto_generated
|
||||
videos += extract_videos(nodeset)
|
||||
else
|
||||
videos += extract_videos(nodeset, ucid)
|
||||
end
|
||||
else
|
||||
break
|
||||
end
|
||||
end
|
||||
videos, count = get_60_videos(ucid, page, auto_generated)
|
||||
|
||||
templated "channel"
|
||||
end
|
||||
@ -1759,6 +1760,8 @@ get "/api/v1/captions/:id" do |env|
|
||||
client = make_client(YT_URL)
|
||||
begin
|
||||
video = get_video(id, PG_DB, proxies)
|
||||
rescue ex : VideoRedirect
|
||||
next env.redirect "/api/v1/captions/#{ex.message}"
|
||||
rescue ex
|
||||
halt env, status_code: 403
|
||||
end
|
||||
@ -1874,31 +1877,34 @@ get "/api/v1/comments/:id" do |env|
|
||||
|
||||
proxies.each do |region, list|
|
||||
spawn do
|
||||
proxy_html = %(<meta itemprop="regionsAllowed" content="">)
|
||||
|
||||
list.each do |proxy|
|
||||
begin
|
||||
proxy_client = HTTPClient.new(YT_URL)
|
||||
proxy_client.read_timeout = 10.seconds
|
||||
proxy_client.connect_timeout = 10.seconds
|
||||
|
||||
proxy = list.sample(1)[0]
|
||||
proxy = HTTPProxy.new(proxy_host: proxy[:ip], proxy_port: proxy[:port])
|
||||
proxy_client.set_proxy(proxy)
|
||||
|
||||
proxy_html = proxy_client.get("/watch?v=#{id}&bpctr=#{Time.new.epoch + 2000}&gl=US&hl=en&disable_polymer=1")
|
||||
response = proxy_client.get("/watch?v=#{id}&bpctr=#{Time.new.epoch + 2000}&gl=US&hl=en&disable_polymer=1")
|
||||
proxy_headers = HTTP::Headers.new
|
||||
proxy_headers["cookie"] = proxy_html.cookies.add_request_headers(headers)["cookie"]
|
||||
proxy_html = proxy_html.body
|
||||
proxy_headers["cookie"] = response.cookies.add_request_headers(headers)["cookie"]
|
||||
proxy_html = response.body
|
||||
|
||||
if proxy_html.match(/<meta itemprop="regionsAllowed" content="">/)
|
||||
bypass_channel.send(nil)
|
||||
else
|
||||
if !proxy_html.match(/<meta itemprop="regionsAllowed" content="">/)
|
||||
bypass_channel.send({proxy_html, proxy_client, proxy_headers})
|
||||
break
|
||||
end
|
||||
|
||||
break
|
||||
rescue ex
|
||||
end
|
||||
end
|
||||
|
||||
# If none of the proxies we tried returned a valid response
|
||||
if proxy_html.match(/<meta itemprop="regionsAllowed" content="">/)
|
||||
bypass_channel.send(nil)
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
@ -2203,6 +2209,8 @@ get "/api/v1/videos/:id" do |env|
|
||||
|
||||
begin
|
||||
video = get_video(id, PG_DB, proxies)
|
||||
rescue ex : VideoRedirect
|
||||
next env.redirect "/api/v1/videos/#{ex.message}"
|
||||
rescue ex
|
||||
error_message = {"error" => ex.message}.to_json
|
||||
halt env, status_code: 500, response: error_message
|
||||
@ -2246,6 +2254,22 @@ get "/api/v1/videos/:id" do |env|
|
||||
json.field "authorId", video.ucid
|
||||
json.field "authorUrl", "/channel/#{video.ucid}"
|
||||
|
||||
json.field "authorThumbnails" do
|
||||
json.array do
|
||||
qualities = [32, 48, 76, 100, 176, 512]
|
||||
|
||||
qualities.each do |quality|
|
||||
json.object do
|
||||
json.field "url", video.author_thumbnail.gsub("=s48-", "=s#{quality}-")
|
||||
json.field "width", quality
|
||||
json.field "height", quality
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
json.field "subCountText", video.sub_count_text
|
||||
|
||||
json.field "lengthSeconds", video.info["length_seconds"].to_i
|
||||
if video.info["allow_ratings"]?
|
||||
json.field "allowRatings", video.info["allow_ratings"] == "1"
|
||||
@ -2464,30 +2488,10 @@ get "/api/v1/channels/:ucid" do |env|
|
||||
halt env, status_code: 404, response: error_message
|
||||
end
|
||||
|
||||
client = make_client(YT_URL)
|
||||
|
||||
page = 1
|
||||
videos, count = get_60_videos(ucid, page, auto_generated)
|
||||
|
||||
videos = [] of SearchVideo
|
||||
2.times do |i|
|
||||
url = produce_channel_videos_url(ucid, page * 2 + (i - 1), auto_generated: auto_generated)
|
||||
response = client.get(url)
|
||||
json = JSON.parse(response.body)
|
||||
|
||||
if json["content_html"]? && !json["content_html"].as_s.empty?
|
||||
document = XML.parse_html(json["content_html"].as_s)
|
||||
nodeset = document.xpath_nodes(%q(//li[contains(@class, "feed-item-container")]))
|
||||
|
||||
if auto_generated
|
||||
videos += extract_videos(nodeset)
|
||||
else
|
||||
videos += extract_videos(nodeset, ucid)
|
||||
end
|
||||
else
|
||||
break
|
||||
end
|
||||
end
|
||||
|
||||
client = make_client(YT_URL)
|
||||
channel_html = client.get("/channel/#{ucid}/about?disable_polymer=1").body
|
||||
channel_html = XML.parse_html(channel_html)
|
||||
banner = channel_html.xpath_node(%q(//div[@id="gh-banner"]/style)).not_nil!.content
|
||||
@ -2623,27 +2627,7 @@ end
|
||||
halt env, status_code: 404, response: error_message
|
||||
end
|
||||
|
||||
client = make_client(YT_URL)
|
||||
|
||||
videos = [] of SearchVideo
|
||||
2.times do |i|
|
||||
url = produce_channel_videos_url(ucid, page * 2 + (i - 1), auto_generated: auto_generated)
|
||||
response = client.get(url)
|
||||
json = JSON.parse(response.body)
|
||||
|
||||
if json["content_html"]? && !json["content_html"].as_s.empty?
|
||||
document = XML.parse_html(json["content_html"].as_s)
|
||||
nodeset = document.xpath_nodes(%q(//li[contains(@class, "feed-item-container")]))
|
||||
|
||||
if auto_generated
|
||||
videos += extract_videos(nodeset)
|
||||
else
|
||||
videos += extract_videos(nodeset, ucid)
|
||||
end
|
||||
else
|
||||
break
|
||||
end
|
||||
end
|
||||
videos, count = get_60_videos(ucid, page, auto_generated)
|
||||
|
||||
result = JSON.build do |json|
|
||||
json.array do
|
||||
@ -2906,6 +2890,15 @@ get "/api/v1/playlists/:plid" do |env|
|
||||
page = env.params.query["page"]?.try &.to_i?
|
||||
page ||= 1
|
||||
|
||||
format = env.params.query["format"]?
|
||||
format ||= "json"
|
||||
|
||||
continuation = env.params.query["continuation"]?
|
||||
|
||||
if plid.starts_with? "RD"
|
||||
next env.redirect "/api/v1/mixes/#{plid}"
|
||||
end
|
||||
|
||||
begin
|
||||
playlist = fetch_playlist(plid)
|
||||
rescue ex
|
||||
@ -2914,7 +2907,7 @@ get "/api/v1/playlists/:plid" do |env|
|
||||
end
|
||||
|
||||
begin
|
||||
videos = fetch_playlist_videos(plid, page, playlist.video_count)
|
||||
videos = fetch_playlist_videos(plid, page, playlist.video_count, continuation)
|
||||
rescue ex
|
||||
videos = [] of PlaylistVideo
|
||||
end
|
||||
@ -2973,6 +2966,17 @@ get "/api/v1/playlists/:plid" do |env|
|
||||
end
|
||||
end
|
||||
|
||||
if format == "html"
|
||||
response = JSON.parse(response)
|
||||
playlist_html = template_playlist(response)
|
||||
next_video = response["videos"].as_a[1]?.try &.["videoId"]
|
||||
|
||||
response = {
|
||||
"playlistHtml" => playlist_html,
|
||||
"nextVideo" => next_video,
|
||||
}.to_json
|
||||
end
|
||||
|
||||
response
|
||||
end
|
||||
|
||||
@ -2984,6 +2988,9 @@ get "/api/v1/mixes/:rdid" do |env|
|
||||
continuation = env.params.query["continuation"]?
|
||||
continuation ||= rdid.lchop("RD")
|
||||
|
||||
format = env.params.query["format"]?
|
||||
format ||= "json"
|
||||
|
||||
begin
|
||||
mix = fetch_mix(rdid, continuation)
|
||||
rescue ex
|
||||
@ -3022,6 +3029,17 @@ get "/api/v1/mixes/:rdid" do |env|
|
||||
end
|
||||
end
|
||||
|
||||
if format == "html"
|
||||
response = JSON.parse(response)
|
||||
playlist_html = template_mix(response)
|
||||
next_video = response["videos"].as_a[1]?.try &.["videoId"]
|
||||
|
||||
response = {
|
||||
"playlistHtml" => playlist_html,
|
||||
"nextVideo" => next_video,
|
||||
}.to_json
|
||||
end
|
||||
|
||||
response
|
||||
end
|
||||
|
||||
@ -3045,6 +3063,8 @@ get "/api/manifest/dash/id/:id" do |env|
|
||||
client = make_client(YT_URL)
|
||||
begin
|
||||
video = get_video(id, PG_DB, proxies)
|
||||
rescue ex : VideoRedirect
|
||||
next env.redirect "/api/manifest/dash/id/#{ex.message}"
|
||||
rescue ex
|
||||
halt env, status_code: 403
|
||||
end
|
||||
@ -3408,6 +3428,24 @@ get "/vi/:id/:name" do |env|
|
||||
end
|
||||
|
||||
error 404 do |env|
|
||||
if md = env.request.path.match(/^\/(?<id>[a-zA-Z0-9_-]{11})/)
|
||||
id = md["id"]
|
||||
|
||||
params = [] of String
|
||||
env.params.query.each do |k, v|
|
||||
params << "#{k}=#{v}"
|
||||
end
|
||||
params = params.join("&")
|
||||
|
||||
url = "/watch?v=#{id}"
|
||||
if !params.empty?
|
||||
url += "&#{params}"
|
||||
end
|
||||
|
||||
env.response.headers["Location"] = url
|
||||
halt env, status_code: 302
|
||||
end
|
||||
|
||||
error_message = "404 Page not found"
|
||||
templated "error"
|
||||
end
|
||||
|
@ -176,7 +176,7 @@ def produce_channel_videos_url(ucid, page = 1, auto_generated = nil)
|
||||
continuation = Base64.urlsafe_encode(continuation)
|
||||
continuation = URI.escape(continuation)
|
||||
|
||||
url = "/browse_ajax?continuation=#{continuation}"
|
||||
url = "/browse_ajax?continuation=#{continuation}&gl=US&hl=en"
|
||||
|
||||
return url
|
||||
end
|
||||
@ -196,6 +196,12 @@ def get_about_info(ucid)
|
||||
raise "User does not exist."
|
||||
end
|
||||
|
||||
sub_count = about.xpath_node(%q(//span[contains(text(), "subscribers")]))
|
||||
if sub_count
|
||||
sub_count = sub_count.content.delete(", subscribers").to_i?
|
||||
end
|
||||
sub_count ||= 0
|
||||
|
||||
author = about.xpath_node(%q(//span[@class="qualified-channel-title-text"]/a)).not_nil!.content
|
||||
ucid = about.xpath_node(%q(//link[@rel="canonical"])).not_nil!["href"].split("/")[-1]
|
||||
|
||||
@ -207,5 +213,37 @@ def get_about_info(ucid)
|
||||
auto_generated = true
|
||||
end
|
||||
|
||||
return {author, ucid, auto_generated}
|
||||
return {author, ucid, auto_generated, sub_count}
|
||||
end
|
||||
|
||||
def get_60_videos(ucid, page, auto_generated)
|
||||
count = 0
|
||||
videos = [] of SearchVideo
|
||||
|
||||
client = make_client(YT_URL)
|
||||
|
||||
2.times do |i|
|
||||
url = produce_channel_videos_url(ucid, page * 2 + (i - 1), auto_generated: auto_generated)
|
||||
response = client.get(url)
|
||||
json = JSON.parse(response.body)
|
||||
|
||||
if json["content_html"]? && !json["content_html"].as_s.empty?
|
||||
document = XML.parse_html(json["content_html"].as_s)
|
||||
nodeset = document.xpath_nodes(%q(//li[contains(@class, "feed-item-container")]))
|
||||
|
||||
if !json["load_more_widget_html"]?.try &.as_s.empty?
|
||||
count += 30
|
||||
end
|
||||
|
||||
if auto_generated
|
||||
videos += extract_videos(nodeset)
|
||||
else
|
||||
videos += extract_videos(nodeset, ucid)
|
||||
end
|
||||
else
|
||||
break
|
||||
end
|
||||
end
|
||||
|
||||
return videos, count
|
||||
end
|
||||
|
@ -104,21 +104,21 @@ def template_youtube_comments(comments)
|
||||
|
||||
html += <<-END_HTML
|
||||
<div class="pure-g">
|
||||
<div class="pure-u-2-24">
|
||||
<div class="pure-u-4-24 pure-u-md-2-24">
|
||||
<img style="width:90%; padding-right:1em; padding-top:1em;" src="#{author_thumbnail}">
|
||||
</div>
|
||||
<div class="pure-u-22-24">
|
||||
<div class="pure-u-20-24 pure-u-md-22-24">
|
||||
<p>
|
||||
<a href="javascript:void(0)" onclick="toggle(this)">[ - ]</a>
|
||||
<i class="icon ion-ios-thumbs-up"></i> #{child["likeCount"]}
|
||||
<b><a href="#{child["authorUrl"]}">#{child["author"]}</a></b>
|
||||
- #{recode_date(Time.epoch(child["published"].as_i64))} ago
|
||||
</p>
|
||||
<div>
|
||||
<b>
|
||||
<a href="#{child["authorUrl"]}">#{child["author"]}</a>
|
||||
</b>
|
||||
<p style="white-space:pre-wrap">#{child["contentHtml"]}</p>
|
||||
#{replies_html}
|
||||
</div>
|
||||
</div>
|
||||
#{recode_date(Time.epoch(child["published"].as_i64))} ago
|
||||
|
|
||||
<i class="icon ion-ios-thumbs-up"></i> #{child["likeCount"]}
|
||||
</p>
|
||||
#{replies_html}
|
||||
</div>
|
||||
</div>
|
||||
END_HTML
|
||||
end
|
||||
@ -156,10 +156,10 @@ def template_reddit_comments(root)
|
||||
|
||||
content = <<-END_HTML
|
||||
<p>
|
||||
<a href="javascript:void(0)" onclick="toggle(this)">[ - ]</a>
|
||||
<i class="icon ion-ios-thumbs-up"></i> #{score}
|
||||
<a href="javascript:void(0)" onclick="toggle_parent(this)">[ - ]</a>
|
||||
<b><a href="https://www.reddit.com/user/#{author}">#{author}</a></b>
|
||||
- #{recode_date(child.created_utc)} ago
|
||||
#{score} points
|
||||
#{recode_date(child.created_utc)} ago
|
||||
</p>
|
||||
<div>
|
||||
#{body_html}
|
||||
|
@ -2,6 +2,7 @@ class Config
|
||||
YAML.mapping({
|
||||
crawl_threads: Int32,
|
||||
channel_threads: Int32,
|
||||
feed_threads: Int32,
|
||||
video_threads: Int32,
|
||||
db: NamedTuple(
|
||||
user: String,
|
||||
@ -14,6 +15,7 @@ class Config
|
||||
https_only: Bool?,
|
||||
hmac_key: String?,
|
||||
full_refresh: Bool,
|
||||
geo_bypass: Bool,
|
||||
})
|
||||
end
|
||||
|
||||
|
@ -93,6 +93,25 @@ def get_proxies(country_code = "US")
|
||||
return get_nova_proxies(country_code)
|
||||
end
|
||||
|
||||
def filter_proxies(proxies)
|
||||
proxies.select! do |proxy|
|
||||
begin
|
||||
client = HTTPClient.new(YT_URL)
|
||||
client.read_timeout = 10.seconds
|
||||
client.connect_timeout = 10.seconds
|
||||
|
||||
proxy = HTTPProxy.new(proxy_host: proxy[:ip], proxy_port: proxy[:port])
|
||||
client.set_proxy(proxy)
|
||||
|
||||
client.head("/").status_code == 200
|
||||
rescue ex
|
||||
false
|
||||
end
|
||||
end
|
||||
|
||||
return proxies
|
||||
end
|
||||
|
||||
def get_nova_proxies(country_code = "US")
|
||||
country_code = country_code.downcase
|
||||
client = HTTP::Client.new(URI.parse("https://www.proxynova.com"))
|
||||
@ -127,7 +146,7 @@ def get_nova_proxies(country_code = "US")
|
||||
proxies << {ip: ip, port: port, score: score}
|
||||
end
|
||||
|
||||
proxies = proxies.sort_by { |proxy| proxy[:score] }.reverse
|
||||
# proxies = proxies.sort_by { |proxy| proxy[:score] }.reverse
|
||||
return proxies
|
||||
end
|
||||
|
||||
|
@ -238,3 +238,9 @@ def write_var_int(value : Int)
|
||||
|
||||
return bytes
|
||||
end
|
||||
|
||||
def sha256(text)
|
||||
digest = OpenSSL::Digest.new("SHA256")
|
||||
digest << text
|
||||
return digest.hexdigest
|
||||
end
|
||||
|
@ -104,6 +104,44 @@ def refresh_videos(db)
|
||||
end
|
||||
end
|
||||
|
||||
def refresh_feeds(db, max_threads = 1)
|
||||
max_channel = Channel(Int32).new
|
||||
|
||||
spawn do
|
||||
max_threads = max_channel.receive
|
||||
active_threads = 0
|
||||
active_channel = Channel(Bool).new
|
||||
|
||||
loop do
|
||||
db.query("SELECT email FROM users") do |rs|
|
||||
rs.each do
|
||||
email = rs.read(String)
|
||||
view_name = "subscriptions_#{sha256(email)[0..7]}"
|
||||
|
||||
if active_threads >= max_threads
|
||||
if active_channel.receive
|
||||
active_threads -= 1
|
||||
end
|
||||
end
|
||||
|
||||
active_threads += 1
|
||||
spawn do
|
||||
begin
|
||||
db.exec("REFRESH MATERIALIZED VIEW #{view_name}")
|
||||
rescue ex
|
||||
STDOUT << "REFRESH " << email << " : " << ex.message << "\n"
|
||||
end
|
||||
|
||||
active_channel.send(true)
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
max_channel.send(max_threads)
|
||||
end
|
||||
|
||||
def pull_top_videos(config, db)
|
||||
if config.dl_api_key
|
||||
DetectLanguage.configure do |dl_config|
|
||||
@ -156,39 +194,14 @@ def update_decrypt_function
|
||||
end
|
||||
|
||||
def find_working_proxies(regions)
|
||||
proxy_channel = Channel({String, Array({ip: String, port: Int32})}).new
|
||||
loop do
|
||||
regions.each do |region|
|
||||
proxies = get_proxies(region).first(20)
|
||||
proxies = proxies.map { |proxy| {ip: proxy[:ip], port: proxy[:port]} }
|
||||
# proxies = filter_proxies(proxies)
|
||||
|
||||
regions.each do |region|
|
||||
spawn do
|
||||
loop do
|
||||
begin
|
||||
proxies = get_proxies(region).first(20)
|
||||
rescue ex
|
||||
next proxy_channel.send({region, Array({ip: String, port: Int32}).new})
|
||||
end
|
||||
|
||||
proxies.select! do |proxy|
|
||||
begin
|
||||
client = HTTPClient.new(YT_URL)
|
||||
client.read_timeout = 10.seconds
|
||||
client.connect_timeout = 10.seconds
|
||||
|
||||
proxy = HTTPProxy.new(proxy_host: proxy[:ip], proxy_port: proxy[:port])
|
||||
client.set_proxy(proxy)
|
||||
|
||||
client.get("/").status_code == 200
|
||||
rescue ex
|
||||
false
|
||||
end
|
||||
end
|
||||
proxies = proxies.map { |proxy| {ip: proxy[:ip], port: proxy[:port]} }
|
||||
|
||||
proxy_channel.send({region, proxies})
|
||||
end
|
||||
yield region, proxies
|
||||
Fiber.yield
|
||||
end
|
||||
end
|
||||
|
||||
loop do
|
||||
yield proxy_channel.receive
|
||||
end
|
||||
end
|
||||
|
@ -6,6 +6,7 @@ class MixVideo
|
||||
ucid: String,
|
||||
length_seconds: Int32,
|
||||
index: Int32,
|
||||
mixes: Array(String),
|
||||
})
|
||||
end
|
||||
|
||||
@ -34,6 +35,10 @@ def fetch_mix(rdid, video_id, cookies = nil)
|
||||
raise "Could not create mix."
|
||||
end
|
||||
|
||||
if !yt_data["contents"]["twoColumnWatchNextResults"]["playlist"]?
|
||||
raise "Could not create mix."
|
||||
end
|
||||
|
||||
playlist = yt_data["contents"]["twoColumnWatchNextResults"]["playlist"]["playlist"]
|
||||
mix_title = playlist["title"].as_s
|
||||
|
||||
@ -59,7 +64,8 @@ def fetch_mix(rdid, video_id, cookies = nil)
|
||||
author,
|
||||
ucid,
|
||||
length_seconds,
|
||||
index
|
||||
index,
|
||||
[rdid]
|
||||
)
|
||||
end
|
||||
|
||||
@ -72,3 +78,37 @@ def fetch_mix(rdid, video_id, cookies = nil)
|
||||
videos = videos.first(50)
|
||||
return Mix.new(mix_title, rdid, videos)
|
||||
end
|
||||
|
||||
def template_mix(mix)
|
||||
html = <<-END_HTML
|
||||
<h3>
|
||||
<a href="/mix?list=#{mix["mixId"]}">
|
||||
#{mix["title"]}
|
||||
</a>
|
||||
</h3>
|
||||
<div class="pure-menu pure-menu-scrollable playlist-restricted">
|
||||
<ol class="pure-menu-list">
|
||||
END_HTML
|
||||
|
||||
mix["videos"].as_a.each do |video|
|
||||
html += <<-END_HTML
|
||||
<li class="pure-menu-item">
|
||||
<a href="/watch?v=#{video["videoId"]}&list=#{mix["mixId"]}">
|
||||
<img style="width:100%;" src="/vi/#{video["videoId"]}/mqdefault.jpg">
|
||||
<p style="width:100%">#{video["title"]}</p>
|
||||
<p>
|
||||
<b style="width: 100%">#{video["author"]}</b>
|
||||
</p>
|
||||
</a>
|
||||
</li>
|
||||
END_HTML
|
||||
end
|
||||
|
||||
html += <<-END_HTML
|
||||
</ol>
|
||||
</div>
|
||||
<hr>
|
||||
END_HTML
|
||||
|
||||
html
|
||||
end
|
||||
|
@ -26,11 +26,23 @@ class Playlist
|
||||
})
|
||||
end
|
||||
|
||||
def fetch_playlist_videos(plid, page, video_count)
|
||||
def fetch_playlist_videos(plid, page, video_count, continuation = nil)
|
||||
client = make_client(YT_URL)
|
||||
|
||||
if video_count > 100
|
||||
if continuation
|
||||
html = client.get("/watch?v=#{continuation}&list=#{plid}&bpctr=#{Time.new.epoch + 2000}&gl=US&hl=en&disable_polymer=1")
|
||||
html = XML.parse_html(html.body)
|
||||
|
||||
index = html.xpath_node(%q(//span[@id="playlist-current-index"])).try &.content.to_i?
|
||||
if index
|
||||
index -= 1
|
||||
end
|
||||
index ||= 0
|
||||
else
|
||||
index = (page - 1) * 100
|
||||
end
|
||||
|
||||
if video_count > 100
|
||||
url = produce_playlist_url(plid, index)
|
||||
|
||||
response = client.get(url)
|
||||
@ -53,6 +65,11 @@ def fetch_playlist_videos(plid, page, video_count)
|
||||
nodeset = document.xpath_nodes(%q(.//tr[contains(@class, "pl-video")]))
|
||||
|
||||
videos = extract_playlist(plid, nodeset, 0)
|
||||
if continuation
|
||||
until videos[0].id == continuation
|
||||
videos.shift
|
||||
end
|
||||
end
|
||||
end
|
||||
end

@ -199,3 +216,37 @@ def fetch_playlist(plid)

return playlist
end

def template_playlist(playlist)
html = <<-END_HTML
<h3>
<a href="/playlist?list=#{playlist["playlistId"]}">
#{playlist["title"]}
</a>
</h3>
<div class="pure-menu pure-menu-scrollable playlist-restricted">
<ol class="pure-menu-list">
END_HTML

playlist["videos"].as_a.each do |video|
html += <<-END_HTML
<li class="pure-menu-item">
<a href="/watch?v=#{video["videoId"]}&list=#{playlist["playlistId"]}">
<img style="width:100%;" src="/vi/#{video["videoId"]}/mqdefault.jpg">
<p style="width:100%">#{video["title"]}</p>
<p>
<b style="width: 100%">#{video["author"]}</b>
</p>
</a>
</li>
END_HTML
end

html += <<-END_HTML
</ol>
</div>
<hr>
END_HTML

html
end

@ -119,6 +119,15 @@ def get_user(sid, client, headers, db, refresh = true)

db.exec("INSERT INTO users VALUES (#{args}) \
ON CONFLICT (email) DO UPDATE SET id = users.id || $1, updated = $2, subscriptions = $4", user_array)

begin
view_name = "subscriptions_#{sha256(user.email)[0..7]}"
PG_DB.exec("CREATE MATERIALIZED VIEW #{view_name} AS \
SELECT * FROM channel_videos WHERE \
ucid = ANY ((SELECT subscriptions FROM users WHERE email = '#{user.email}')::text[]) \
ORDER BY published DESC;")
rescue ex
end
end
else
user = fetch_user(sid, client, headers, db)

@ -129,6 +138,15 @@ def get_user(sid, client, headers, db, refresh = true)

db.exec("INSERT INTO users VALUES (#{args}) \
ON CONFLICT (email) DO UPDATE SET id = users.id || $1, updated = $2, subscriptions = $4", user_array)

begin
view_name = "subscriptions_#{sha256(user.email)[0..7]}"
PG_DB.exec("CREATE MATERIALIZED VIEW #{view_name} AS \
SELECT * FROM channel_videos WHERE \
ucid = ANY ((SELECT subscriptions FROM users WHERE email = '#{user.email}')::text[]) \
ORDER BY published DESC;")
rescue ex
end
end

return user
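
Each user now gets a cached `subscriptions_<hash>` materialized view over `channel_videos`, so the expensive subscriptions join runs once instead of on every feed load. A rough sketch of the read path this enables, assuming the feed request (or a background job) refreshes the view first; `ChannelVideo` and the limit are illustrative, not lines from this diff:

```crystal
# Hypothetical feed query against the cached view; not part of this commit range.
view_name = "subscriptions_#{sha256(user.email)[0..7]}"
PG_DB.exec("REFRESH MATERIALIZED VIEW #{view_name}")
videos = PG_DB.query_all("SELECT * FROM #{view_name} LIMIT 40", as: ChannelVideo)
```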

@ -456,7 +456,9 @@ class Video
is_family_friendly: Bool,
genre: String,
genre_url: String,
license: {
license: String,
sub_count_text: String,
author_thumbnail: {
type: String,
default: "",
},

@ -477,6 +479,9 @@ class CaptionName
)
end
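
The `UPDATE videos SET (...)` statement further down now writes `sub_count_text` and `author_thumbnail`, so an existing database needs matching columns before refreshed videos can be stored. A hedged migration sketch (column types are assumed; this is not part of the commit range):

```crystal
# Hypothetical one-off migration for existing installs.
PG_DB.exec("ALTER TABLE videos ADD COLUMN IF NOT EXISTS sub_count_text text")
PG_DB.exec("ALTER TABLE videos ADD COLUMN IF NOT EXISTS author_thumbnail text")
```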

class VideoRedirect < Exception
end

def get_video(id, db, proxies = {} of String => Array({ip: String, port: Int32}), refresh = true)
if db.query_one?("SELECT EXISTS (SELECT true FROM videos WHERE id = $1)", id, as: Bool)
video = db.query_one("SELECT * FROM videos WHERE id = $1", id, as: Video)

@ -490,8 +495,8 @@ def get_video(id, db, proxies = {} of String => Array({ip: String, port: Int32})
args = arg_array(video_array[1..-1], 2)

db.exec("UPDATE videos SET (info,updated,title,views,likes,dislikes,wilson_score,\
published,description,language,author,ucid, allowed_regions, is_family_friendly,\
genre, genre_url, license)\
published,description,language,author,ucid,allowed_regions,is_family_friendly,\
genre,genre_url,license,sub_count_text,author_thumbnail)\
= (#{args}) WHERE id = $1", video_array)
rescue ex
db.exec("DELETE FROM videos * WHERE id = $1", id)

@ -511,14 +516,18 @@ def get_video(id, db, proxies = {} of String => Array({ip: String, port: Int32})
end

def fetch_video(id, proxies)
html_channel = Channel(XML::Node).new
html_channel = Channel(XML::Node | String).new
info_channel = Channel(HTTP::Params).new

spawn do
client = make_client(YT_URL)
html = client.get("/watch?v=#{id}&bpctr=#{Time.new.epoch + 2000}&gl=US&hl=en&disable_polymer=1")
html = XML.parse_html(html.body)

if md = html.headers["location"]?.try &.match(/v=(?<id>[a-zA-Z0-9_-]{11})/)
next html_channel.send(md["id"])
end

html = XML.parse_html(html.body)
html_channel.send(html)
end

@ -536,6 +545,11 @@ def fetch_video(id, proxies)
end

html = html_channel.receive
if html.as?(String)
raise VideoRedirect.new("#{html.as(String)}")
end
html = html.as(XML::Node)

info = info_channel.receive

if info["reason"]? && info["reason"].includes? "your country"

@ -543,6 +557,10 @@ def fetch_video(id, proxies)

proxies.each do |region, list|
spawn do
info = HTTP::Params.new({
"reason" => [info["reason"]],
})

list.each do |proxy|
begin
client = HTTPClient.new(YT_URL)

@ -555,14 +573,16 @@ def fetch_video(id, proxies)
info = HTTP::Params.parse(client.get("/get_video_info?video_id=#{id}&ps=default&eurl=&gl=US&hl=en&disable_polymer=1").body)
if !info["reason"]?
bypass_channel.send(proxy)
else
bypass_channel.send(nil)
break
end

break
rescue ex
end
end

# If none of the proxies we tried returned a valid response
if info["reason"]?
bypass_channel.send(nil)
end
end
end
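
One detail worth calling out: when the proxy loop gives up, it still sends `nil` on `bypass_channel`, because the fiber waiting on the other end would otherwise block forever. A sketch of that receiving side, with the names assumed rather than taken from this diff:

```crystal
# Hypothetical receiver: wait for the first fiber to report either a working
# proxy or nil, so the request can fail fast instead of hanging.
if proxy = bypass_channel.receive
  # Retry the watch page and get_video_info through `proxy`.
else
  raise info["reason"]
end
```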

@ -641,11 +661,25 @@ def fetch_video(id, proxies)
if license
license = license.content
else
license ||= ""
license = ""
end

sub_count_text = html.xpath_node(%q(//span[contains(@class, "yt-subscriber-count")]))
if sub_count_text
sub_count_text = sub_count_text["title"]
else
sub_count_text = "0"
end

author_thumbnail = html.xpath_node(%(//img[@alt="#{author}"]))
if author_thumbnail
author_thumbnail = author_thumbnail["data-thumb"]
else
author_thumbnail = ""
end

video = Video.new(id, info, Time.now, title, views, likes, dislikes, wilson_score, published, description,
nil, author, ucid, allowed_regions, is_family_friendly, genre, genre_url, license)
nil, author, ucid, allowed_regions, is_family_friendly, genre, genre_url, license, sub_count_text, author_thumbnail)

return video
end
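
With `html_channel` now carrying `XML::Node | String`, `fetch_video` can hand back a redirect target instead of a parsed page, and `get_video` surfaces it as a `VideoRedirect` exception. A sketch of how a Kemal-style watch route could turn that into an HTTP redirect; the route body is simplified and is not taken from this diff:

```crystal
# Hypothetical route-level handling of VideoRedirect.
get "/watch" do |env|
  id = env.params.query["v"]

  begin
    video = get_video(id, PG_DB, proxies)
  rescue ex : VideoRedirect
    # The exception message carries the video ID we were redirected to.
    next env.redirect "/watch?v=#{ex.message}"
  rescue ex
    next ex.message
  end

  # ... render the watch page with `video` ...
end
```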

@ -13,23 +13,32 @@
</div>
</div>

<p class="h-box">
<div class="h-box">
<% if user %>
<% if subscriptions.includes? ucid %>
<a href="/subscription_ajax?action_remove_subscriptions=1&c=<%= ucid %>&referer=<%= env.get("current_page") %>">
<b>Unsubscribe from <%= author %></b>
</a>
<p>
<a id="subscribe" onclick="unsubscribe()" class="pure-button pure-button-primary"
href="/subscription_ajax?action_remove_subscriptions=1&c=<%= ucid %>&referer=<%= env.get("current_page") %>">
<b>Unsubscribe from <%= author %> <%= number_with_separator(sub_count) %></b>
</a>
</p>
<% else %>
<a href="/subscription_ajax?action_create_subscription_to_channel=1&c=<%= ucid %>&referer=<%= env.get("current_page") %>">
<b>Subscribe to <%= author %></b>
</a>
<p>
<a id="subscribe" onclick="subscribe()" class="pure-button pure-button-primary"
href="/subscription_ajax?action_create_subscription_to_channel=1&c=<%= ucid %>&referer=<%= env.get("current_page") %>">
<b>Subscribe to <%= author %> <%= number_with_separator(sub_count) %></b>
</a>
</p>
<% end %>
<% else %>
<a href="/login?referer=<%= env.get("current_page") %>">
<b>Login to subscribe to <%= author %></b>
</a>
<p>
<a id="subscribe" class="pure-button pure-button-primary"
href="/login?referer=<%= env.get("current_page") %>">
<b>Login to subscribe to <%= author %></b>
</a>
</p>
<% end %>
</p>
</div>

<p class="h-box">
<a href="https://www.youtube.com/channel/<%= ucid %>">View channel on YouTube</a>

@ -51,8 +60,50 @@
</div>
<div class="pure-u-1 pure-u-md-3-5"></div>
<div style="text-align:right;" class="pure-u-1 pure-u-md-1-5">
<% if videos.size == 60 %>
<% if count == 60 %>
<a href="/channel/<%= ucid %>?page=<%= page + 1 %>">Next page</a>
<% end %>
</div>
</div>

<script>
document.getElementById("subscribe")["href"] = "javascript:void(0);"

function subscribe() {
var url = "/subscription_ajax?action_create_subscription_to_channel=1&c=<%= ucid %>&referer=<%= env.get("current_page") %>";
var xhr = new XMLHttpRequest();
xhr.responseType = "json";
xhr.timeout = 20000;
xhr.open("GET", url, true);
xhr.send();

xhr.onreadystatechange = function() {
if (xhr.readyState == 4) {
if (xhr.status == 200) {
subscribe_button = document.getElementById("subscribe");
subscribe_button.onclick = unsubscribe;
subscribe_button.innerHTML = '<b>Unsubscribe from <%= author %> <%= number_with_separator(sub_count + 1) %></b>'
}
}
}
}

function unsubscribe() {
var url = "/subscription_ajax?action_remove_subscriptions=1&c=<%= ucid %>&referer=<%= env.get("current_page") %>";
var xhr = new XMLHttpRequest();
xhr.responseType = "json";
xhr.timeout = 20000;
xhr.open("GET", url, true);
xhr.send();

xhr.onreadystatechange = function() {
if (xhr.readyState == 4) {
if (xhr.status == 200) {
subscribe_button = document.getElementById("subscribe");
subscribe_button.onclick = subscribe;
subscribe_button.innerHTML = '<b>Subscribe to <%= author %> <%= number_with_separator(sub_count) %></b>'
}
}
}
}
</script>
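
Both the plain links and the new AJAX buttons hit the same `/subscription_ajax` endpoint with `action_create_subscription_to_channel` or `action_remove_subscriptions`, a channel ID in `c`, and a `referer`. The endpoint itself is not shown in these diffs; a hedged sketch of roughly what such a handler would do, with the SQL and names purely illustrative:

```crystal
# Hypothetical server side of the subscribe buttons above.
get "/subscription_ajax" do |env|
  user = env.get("user").as(User)
  ucid = env.params.query["c"]

  if env.params.query["action_create_subscription_to_channel"]?
    PG_DB.exec("UPDATE users SET subscriptions = array_append(subscriptions, $1) WHERE email = $2", ucid, user.email)
  elsif env.params.query["action_remove_subscriptions"]?
    PG_DB.exec("UPDATE users SET subscriptions = array_remove(subscriptions, $1) WHERE email = $2", ucid, user.email)
  end

  env.redirect env.params.query["referer"]? || "/"
end
```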

@ -32,7 +32,7 @@
<p><%= number_with_separator(item.video_count) %> videos</p>
<p>PLAYLIST</p>
<% when MixVideo %>
<a style="width:100%;" href="/watch?v=<%= item.id %>">
<a style="width:100%;" href="/watch?v=<%= item.id %>&list=<%= item.mixes[0] %>">
<% if env.get?("user") && env.get("user").as(User).preferences.thin_mode %>
<% else %>
<img style="width:100%;" src="/vi/<%= item.id %>/mqdefault.jpg"/>

@ -13,13 +13,13 @@
<div class="pure-g h-box">
<div class="pure-u-1 pure-u-md-1-5">
<% if page >= 2 %>
<a href="/search?q=<%= query %>&page=<%= page - 1 %>">Previous page</a>
<a href="/search?q=<%= HTML.escape(query.not_nil!) %>&page=<%= page - 1 %>">Previous page</a>
<% end %>
</div>
<div class="pure-u-1 pure-u-md-3-5"></div>
<div style="text-align:right;" class="pure-u-1 pure-u-md-1-5">
<% if count >= 20 %>
<a href="/search?q=<%= query %>&page=<%= page + 1 %>">Next page</a>
<a href="/search?q=<%= HTML.escape(query.not_nil!) %>&page=<%= page + 1 %>">Next page</a>
<% end %>
</div>
</div>
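
Escaping the query before interpolating it into the pagination links keeps a crafted search term from injecting markup into the page. For reference, Crystal's standard-library escaper:

```crystal
require "html"

HTML.escape("cats & dogs <3") # => "cats &amp; dogs &lt;3"
```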

@ -22,6 +22,7 @@
<meta name="twitter:player" content="<%= host_url %>/embed/<%= video.id %>">
<meta name="twitter:player:width" content="1280">
<meta name="twitter:player:height" content="720">
<script src="/js/watch.js"></script>
<%= rendered "components/player_sources" %>
<title><%= HTML.escape(video.title) %> - Invidious</title>
<% end %>

@ -91,20 +92,23 @@
<% if user %>
<% if subscriptions.includes? video.ucid %>
<p>
<a href="/subscription_ajax?action_remove_subscriptions=1&c=<%= video.ucid %>&referer=<%= env.get("current_page") %>">
<b>Unsubscribe from <%= video.author %></b>
<a id="subscribe" onclick="unsubscribe()" class="pure-button pure-button-primary"
href="/subscription_ajax?action_remove_subscriptions=1&c=<%= video.ucid %>&referer=<%= env.get("current_page") %>">
<b>Unsubscribe from <%= video.author %> <%= video.sub_count_text %></b>
</a>
</p>
<% else %>
<p>
<a href="/subscription_ajax?action_create_subscription_to_channel=1&c=<%= video.ucid %>&referer=<%= env.get("current_page") %>">
<b>Subscribe to <%= video.author %></b>
<a id="subscribe" onclick="subscribe()" class="pure-button pure-button-primary"
href="/subscription_ajax?action_create_subscription_to_channel=1&c=<%= video.ucid %>&referer=<%= env.get("current_page") %>">
<b>Subscribe to <%= video.author %> <%= video.sub_count_text %></b>
</a>
</p>
<% end %>
<% else %>
<p>
<a href="/login?referer=<%= env.get("current_page") %>">
<a id="subscribe" class="pure-button pure-button-primary"
href="/login?referer=<%= env.get("current_page") %>">
<b>Login to subscribe to <%= video.author %></b>
</a>
</p>

@ -117,11 +121,15 @@
</div>
<hr>
<div id="comments">
<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>
</div>
</div>
</div>
<div class="pure-u-1 pure-u-md-1-5">
<% if plid %>
<div id="playlist" class="h-box">
</div>
<% end %>

<% if !preferences || preferences && preferences.related_videos %>
<div class="h-box">
<% rvs.each do |rv| %>

@ -144,38 +152,118 @@
</div>

<script>
function toggle(target) {
body = target.parentNode.parentNode.children[1];
if (body.style.display === null || body.style.display === "") {
target.innerHTML = "[ + ]";
body.style.display = "none";
} else {
target.innerHTML = "[ - ]";
body.style.display = "";
}
subscribe_button = document.getElementById("subscribe");
if (subscribe_button.getAttribute('onclick')) {
subscribe_button["href"] = "javascript:void(0);";
}

function toggle_comments(target) {
body = target.parentNode.parentNode.parentNode.children[1];
if (body.style.display === null || body.style.display === "") {
target.innerHTML = "[ + ]";
body.style.display = "none";
} else {
target.innerHTML = "[ - ]";
body.style.display = "";
}
function subscribe() {
var url = "/subscription_ajax?action_create_subscription_to_channel=1&c=<%= video.ucid %>&referer=<%= env.get("current_page") %>";
var xhr = new XMLHttpRequest();
xhr.responseType = "json";
xhr.timeout = 20000;
xhr.open("GET", url, true);
xhr.send();

xhr.onreadystatechange = function() {
if (xhr.readyState == 4) {
if (xhr.status == 200) {
subscribe_button = document.getElementById("subscribe");
subscribe_button.onclick = unsubscribe;
subscribe_button.innerHTML = '<b>Unsubscribe from <%= video.author %> <%= video.sub_count_text %></b>'
}
}
}
}

function get_youtube_replies(target) {
var continuation = target.getAttribute("data-continuation");
function unsubscribe() {
var url = "/subscription_ajax?action_remove_subscriptions=1&c=<%= video.ucid %>&referer=<%= env.get("current_page") %>";
var xhr = new XMLHttpRequest();
xhr.responseType = "json";
xhr.timeout = 20000;
xhr.open("GET", url, true);
xhr.send();

var body = target.parentNode.parentNode;
var fallback = body.innerHTML;
body.innerHTML =
xhr.onreadystatechange = function() {
if (xhr.readyState == 4) {
if (xhr.status == 200) {
subscribe_button = document.getElementById("subscribe");
subscribe_button.onclick = subscribe;
subscribe_button.innerHTML = '<b>Subscribe to <%= video.author %> <%= video.sub_count_text %></b>'
}
}
}
}

<% if plid %>
function get_playlist() {
playlist = document.getElementById("playlist");
playlist.innerHTML = ' \
<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3> \
<hr>'

var plid = "<%= plid %>"

if (plid.startsWith("RD")) {
var plid_url = "/api/v1/mixes/<%= plid %>?continuation=<%= video.id %>&format=html";
} else {
var plid_url = "/api/v1/playlists/<%= plid %>?continuation=<%= video.id %>&format=html";
}

var xhr = new XMLHttpRequest();
xhr.responseType = "json";
xhr.timeout = 20000;
xhr.open("GET", plid_url, true);
xhr.send();

xhr.onreadystatechange = function() {
if (xhr.readyState == 4) {
if (xhr.status == 200) {
playlist.innerHTML = xhr.response.playlistHtml;

if (xhr.response.nextVideo) {
player.on('ended', function() {
window.location.replace("/watch?v="
+ xhr.response.nextVideo
+ "&list=<%= plid %>"
<% if params[:listen] %>
+ "&listen=1"
<% end %>
<% if params[:autoplay] %>
+ "&autoplay=1"
<% end %>
<% if params[:speed] %>
+ "&speed=<%= params[:speed] %>"
<% end %>
);
});
}
} else {
playlist.innerHTML = "";
}
}
};

xhr.ontimeout = function() {
console.log("Pulling playlist timed out.");

comments = document.getElementById("playlist");
comments.innerHTML =
'<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3><hr>';
get_playlist();
};
}

get_playlist();
<% end %>
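
The script above expects the playlist and mix endpoints, when called with `format=html`, to return a JSON object carrying a `playlistHtml` fragment (presumably built by `template_playlist`) and a `nextVideo` ID for autoplay. A sketch of that response shape; `playlist_json` and `next_video_id` are placeholders, not names from these diffs:

```crystal
require "json"

# Hypothetical payload for the format=html playlist/mix response.
payload = JSON.build do |json|
  json.object do
    json.field "playlistHtml", template_playlist(playlist_json)
    json.field "nextVideo", next_video_id
  end
end
```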

function get_reddit_comments() {
comments = document.getElementById("comments");
var fallback = comments.innerHTML;
comments.innerHTML =
'<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>';

var url =
"/api/v1/comments/<%= video.id %>?format=html&continuation=" + continuation;
var url = "/api/v1/comments/<%= video.id %>?source=reddit&format=html";
var xhr = new XMLHttpRequest();
xhr.responseType = "json";
xhr.timeout = 20000;

@ -185,38 +273,19 @@ function get_youtube_replies(target) {
xhr.onreadystatechange = function() {
if (xhr.readyState == 4) {
if (xhr.status == 200) {
body.innerHTML = xhr.response.contentHtml;
} else {
body.innerHTML = fallback;
}
}
};

xhr.ontimeout = function() {
console.log("Pulling comments timed out.");

body.innerHTML = fallback;
};
}

function get_reddit_comments() {
var url = "/api/v1/comments/<%= video.id %>?source=reddit&format=html";
var xhr = new XMLHttpRequest();
xhr.responseType = "json";
xhr.timeout = 20000;
xhr.open("GET", url, true);
xhr.send();

xhr.onreadystatechange = function() {
if (xhr.readyState == 4)
if (xhr.status == 200) {
comments = document.getElementById("comments");
comments.innerHTML = ' \
<div> \
<h3> \
<a href="javascript:void(0)" onclick="toggle_comments(this)">[ - ]</a> \
{title} \
</h3> \
<p> \
<b> \
<a href="javascript:void(0)" onclick="swap_comments(\'youtube\')"> \
View YouTube comments \
</a> \
</b> \
</p> \
<b> \
<a rel="noopener" target="_blank" href="https://reddit.com{permalink}">View more comments on Reddit</a> \
</b> \

@ -231,10 +300,10 @@ function get_reddit_comments() {
<% if preferences && preferences.comments[1] == "youtube" %>
get_youtube_comments();
<% else %>
comments = document.getElementById("comments");
comments.innerHTML = "";
comments.innerHTML = fallback;
<% end %>
}
}
};

xhr.ontimeout = function() {

@ -245,6 +314,11 @@ function get_reddit_comments() {
}

function get_youtube_comments() {
comments = document.getElementById("comments");
var fallback = comments.innerHTML;
comments.innerHTML =
'<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>';

var url = "/api/v1/comments/<%= video.id %>?format=html";
var xhr = new XMLHttpRequest();
xhr.responseType = "json";

@ -253,9 +327,8 @@ function get_youtube_comments()
xhr.send();

xhr.onreadystatechange = function() {
if (xhr.readyState == 4)
if (xhr.readyState == 4) {
if (xhr.status == 200) {
comments = document.getElementById("comments");
if (xhr.response.commentCount > 0) {
comments.innerHTML = ' \
<div> \

@ -263,6 +336,11 @@ function get_youtube_comments() {
<a href="javascript:void(0)" onclick="toggle_comments(this)">[ - ]</a> \
View {commentCount} comments \
</h3> \
<b> \
<a href="javascript:void(0)" onclick="swap_comments(\'reddit\')"> \
View Reddit comments \
</a> \
</b> \
</div> \
<div>{contentHtml}</div> \
<hr>'.supplant({

@ -276,35 +354,59 @@ function get_youtube_comments() {
<% if preferences && preferences.comments[1] == "youtube" %>
get_youtube_comments();
<% else %>
comments = document.getElementById("comments");
comments.innerHTML = "";
<% end %>
}
}
};

xhr.ontimeout = function() {
console.log("Pulling comments timed out.");

comments = document.getElementById("comments");
comments.innerHTML =
'<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>';
get_youtube_comments();
};
}

function commaSeparateNumber(val){
while (/(\d+)(\d{3})/.test(val.toString())){
val = val.toString().replace(/(\d+)(\d{3})/, '$1'+','+'$2');
}
return val;
}
function get_youtube_replies(target) {
var continuation = target.getAttribute('data-continuation');

String.prototype.supplant = function(o) {
return this.replace(/{([^{}]*)}/g, function(a, b) {
var r = o[b];
return typeof r === "string" || typeof r === "number" ? r : a;
});
};
var body = target.parentNode.parentNode;
var fallback = body.innerHTML;
body.innerHTML =
'<h3><center class="loading"><i class="icon ion-ios-refresh"></i></center></h3>';

var url = '/api/v1/comments/<%= video.id %>?format=html&continuation=' +
continuation;
var xhr = new XMLHttpRequest();
xhr.responseType = 'json';
xhr.timeout = 20000;
xhr.open('GET', url, true);
xhr.send();

xhr.onreadystatechange = function() {
if (xhr.readyState == 4) {
if (xhr.status == 200) {
body.innerHTML = ' \
<p><a href="javascript:void(0)" \
onclick="hide_youtube_replies(this)">Hide replies \
</a></p> \
<div>{contentHtml}</div>'.supplant({
contentHtml: xhr.response.contentHtml,
});
} else {
body.innerHTML = fallback;
}
}
};

xhr.ontimeout = function() {
console.log('Pulling comments timed out.');

body.innerHTML = fallback;
};
}

<% if preferences %>
<% if preferences.comments[0] == "youtube" %>