LinuxQuestions.org
Forums > Linux - Software
Old 07-19-2017, 02:00 PM   #1
alielkamel
LQ Newbie
 
Registered: Jul 2017
Posts: 12

Rep: Reputation: Disabled
Curl/Wget with Kodi -- 403 Forbidden code returned


Hi,
I wrote a script that uses curl or wget to get information about streaming links. It works for links hosted directly on a streaming website (links of the form URL/File.mp4), and I can get the information from the servers; I also use wget when I want to download a link.
My problem is how to use curl to confirm that a link exists and is valid on the server, for links that I got from Kodi.
The goal is to prove that the link exists. With curl I get a 403 Forbidden response, even though the link works through Kodi. Here is an example link:
*URL --> http:// dittotv.live-s.cdn.bitgravity .com /cdn-live/_definst_/dittotv/ secure/zee_cinema_hd_Web.smil/ playlist.m3u8
Does anyone have an idea?
Here is the script I use:

Code:
#!/bin/bash
ans2=Y
while [ "$ans2" = "Y" ]; do
    read -r -p "URL to check: " url
    # --fail makes curl exit non-zero on an HTTP error such as 403
    if curl --output /dev/null --silent --fail "$url"; then
        printf '%s --> The link exists!\n' "$url"
    else
        printf '%s --> The link does not exist!\n' "$url"
    fi
    read -r -p "Show the cURL information for the streaming link? (Y/N/Q): " ans
    if [ "$ans" = "Q" ]; then
        exit
    fi
    if [ "$ans" = "Y" ]; then
        curl -v -i "$url"
    else
        printf 'OK! No problem --> Next question:\n'
    fi
    read -r -p "Download the streaming video from the streaming server? (Y/N/Q): " ans3
    if [ "$ans3" = "Q" ]; then
        exit
    fi
    # use if/elif instead of a loop so the script can continue to the next URL
    if [ "$ans3" = "Y" ]; then
        if curl --output /dev/null --silent --head --fail "$url"; then
            wget "$url"
        else
            printf 'The link is down! No file to download.\n'
        fi
    elif [ "$ans3" = "N" ]; then
        printf 'OK! No problem --> Next question:\n'
    fi
    read -r -p "Check another URL? (Y/N): " ans2
    if [ "$ans2" = "N" ]; then
        printf 'Good bye - thank you!\n'
    fi
done
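A possible refinement (a sketch, not part of the original script): instead of curl's bare pass/fail, capture the numeric HTTP status with curl's -w '%{http_code}' option, so a 403 is reported explicitly. The helper names classify_code and check_url are invented for illustration:

```shell
#!/bin/bash
# Sketch: classify the HTTP status code that curl reports.
classify_code() {
  case $1 in
    2??) echo 'OK' ;;
    403) echo 'Forbidden' ;;
    *)   echo 'Failed' ;;
  esac
}

check_url() {
  local code
  # -s: silent, -o /dev/null: discard body, -w: print only the status code
  code=$(curl -s -o /dev/null -w '%{http_code}' "$1")
  printf '%s --> %s (HTTP %s)\n' "$1" "$(classify_code "$code")" "$code"
}
```

Calling check_url on one of the Kodi links would then print "Forbidden (HTTP 403)" instead of the less informative "The link does not exist".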
 
Old 07-20-2017, 12:58 PM   #2
scasey
LQ Veteran
 
Registered: Feb 2013
Location: Tucson, AZ, USA
Distribution: CentOS 7.8.2003
Posts: 5,424

Rep: Reputation: 2054
Quote:
Originally Posted by alielkamel View Post
Hi ,
Here is my script and an example link:
*URL -->http:// dittotv.live-s.cdn.bitgravity .com /cdn-live/_definst_/dittotv/ secure/zee_cinema_hd_Web.smil/ playlist.m3u8
That URL contains spaces. I would expect that to terminate the value in the $url variable at read time resulting in an invalid request.
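If the spaces are only a paste artifact, bash parameter expansion can strip them before the request is made. A minimal sketch (the URL here is shortened for illustration):

```shell
#!/bin/bash
# Delete every space from the value read into $url.
url='http:// example .com /playlist.m3u8'
url="${url// /}"   # bash pattern substitution: replace all spaces with nothing
echo "$url"        # prints: http://example.com/playlist.m3u8
```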
 
1 member found this post helpful.
Old 07-20-2017, 02:48 PM   #3
alielkamel
LQ Newbie
 
Registered: Jul 2017
Posts: 12

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by scasey View Post
That URL contains spaces. I would expect that to terminate the value in the $url variable at read time resulting in an invalid request.
Thank you for your answer.
The URL: http://dittotv.live-s.cdn.bitgravity.../playlist.m3u8 or another one with the same error (403 Forbidden): http://arabolivo.gcdn.co/LQ-AR-MBC-MASR/index.m3u8

I think the forum rules do not allow me to put URLs in my posts until I have made more than 5 posts.
 
Old 07-20-2017, 08:43 PM   #4
AwesomeMachine
LQ Guru
 
Registered: Jan 2005
Location: USA and Italy
Distribution: Debian testing/sid; OpenSuSE; Fedora; Mint
Posts: 5,513

Rep: Reputation: 1009
You're probably receiving the error from curl, because of the way the server is configured. Wget is a bit more elegant in its maneuvers.
 
Old 07-20-2017, 10:11 PM   #5
alielkamel
LQ Newbie
 
Registered: Jul 2017
Posts: 12

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by AwesomeMachine View Post
You're probably receiving the error from curl, because of the way the server is configured. Wget is a bit more elegant in its maneuvers.

So, in your opinion, is wget more effective in this case? If so, do you have an idea how I can do it?
Thanks for your help
 
Old 07-21-2017, 02:54 PM   #6
alielkamel
LQ Newbie
 
Registered: Jul 2017
Posts: 12

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by AwesomeMachine View Post
You're probably receiving the error from curl, because of the way the server is configured. Wget is a bit more elegant in its maneuvers.
I used wget, but same problem; still can't access it.
Code:
read -p "URL to check: " url
wget -q --spider "$url"
echo $?
I get return code 8 when I enter my URL.
 
Old 07-21-2017, 06:57 PM   #7
AwesomeMachine
LQ Guru
 
Registered: Jan 2005
Location: USA and Italy
Distribution: Debian testing/sid; OpenSuSE; Fedora; Mint
Posts: 5,513

Rep: Reputation: 1009
Wget has a spider mode that just checks the link. Otherwise, the server is configured to prevent link checking.
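Spider mode can be combined with -S, which prints the server's response headers and shows exactly which status the server returned. A sketch (the status_line helper and the sample headers are illustrative, not real server output):

```shell
#!/bin/bash
# Real usage would be:  wget --spider -S "$url" 2>&1
# status_line extracts the HTTP status line from captured output.
status_line() {
  grep -m1 'HTTP/' <<<"$1" | sed 's/^ *//'
}

# Illustrative captured headers, not a real response:
sample='  HTTP/1.1 403 Forbidden
  Server: BitGravity'
status_line "$sample"   # prints: HTTP/1.1 403 Forbidden
```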
 
Old 07-21-2017, 07:30 PM   #8
alielkamel
LQ Newbie
 
Registered: Jul 2017
Posts: 12

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by AwesomeMachine View Post
Wget has a spider mode that just checks the link. Otherwise, the server is configured to prevent link checking.
Yes, that's it, but do you have an idea or a proposal? I want to find a solution so that I can verify the links.
 
Old 07-21-2017, 09:57 PM   #9
AwesomeMachine
LQ Guru
 
Registered: Jan 2005
Location: USA and Italy
Distribution: Debian testing/sid; OpenSuSE; Fedora; Mint
Posts: 5,513

Rep: Reputation: 1009
Try the wget spider mode. If it won't work, then the server is configured to prevent link checking. In the latter case, it will never work! There is nothing you can do to make it work unless you run the server. But try wget spider mode before giving up.
 
Old 07-22-2017, 09:08 AM   #10
alielkamel
LQ Newbie
 
Registered: Jul 2017
Posts: 12

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by AwesomeMachine View Post
Try the wget spider mode. If it won't work, then the server is configured to prevent link checking. In the latter case, it will never work! There is nothing you can do to make it work unless you run the server. But try wget spider mode before giving up.

Thanks, but I tried this:
read -p "URL to check: " url
wget -q --spider "$url"
echo $?

I expected 0 or 1, but I received 8 as the return code.
 
Old 07-22-2017, 09:15 AM   #11
BW-userx
LQ Guru
 
Registered: Sep 2013
Location: Somewhere in my head.
Distribution: Slackware (current), FreeBSD, Win10, It varies
Posts: 9,952

Rep: Reputation: 2148
should read

Code:
printf "$red" "Thank you, Good Bye and Good Day sir!!"
In English, I do believe that one thanks the other first before sending him/her off on his/her way, not the other way around.

Last edited by BW-userx; 07-22-2017 at 09:17 AM.
 
Old 07-22-2017, 10:01 AM   #12
alielkamel
LQ Newbie
 
Registered: Jul 2017
Posts: 12

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by BW-userx View Post
should read

Code:
printf "$red" "Thank you, Good Bye and Good Day sir!!"
In English, I do believe that one thanks the other first before sending him/her off on his/her way, not the other way around.
Yes, yes. Is what you propose a solution to my problem? Thank you for your time!
 
Old 07-22-2017, 10:09 AM   #13
BW-userx
LQ Guru
 
Registered: Sep 2013
Location: Somewhere in my head.
Distribution: Slackware (current), FreeBSD, Win10, It varies
Posts: 9,952

Rep: Reputation: 2148
Quote:
Originally Posted by alielkamel View Post
Yes, yes. Is what you propose a solution to my problem? Thank you for your time!
no, but it doesn't hurt to help (you) wherever needed, either.
 
Old 07-22-2017, 11:37 AM   #14
scasey
LQ Veteran
 
Registered: Feb 2013
Location: Tucson, AZ, USA
Distribution: CentOS 7.8.2003
Posts: 5,424

Rep: Reputation: 2054
Quote:
Originally Posted by alielkamel View Post
Thanks, but I tried this:
read -p "URL to check: " url
wget -q --spider "$url"
echo $?

I expected 0 or 1, but I received 8 as the return code.
Suggest removing the -q (quiet) option to see a verbose response; that might explain what's happening. That said, a web search for "wget returned 8" returned:
Code:
Wget may return one of several error codes if it encounters problems.
0
    No problems occurred.
1
    Generic error code.
2
    Parse error—for instance, when parsing command-line options, the ‘.wgetrc’ or ‘.netrc’...
3
    File I/O error.
4
    Network failure.
5
    SSL verification failure.
6
    Username/password authentication failure.
7
    Protocol errors.
8
    Server issued an error response.
So, turning off quiet mode might show you what that error was.

Have you read?
Code:
info wget
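That exit-code table can be turned into a small helper so a script reports the meaning instead of a bare number. A sketch (wget_exit_meaning is an invented name):

```shell
#!/bin/bash
# Map wget's exit status to the descriptions listed above.
wget_exit_meaning() {
  case $1 in
    0) echo 'No problems occurred' ;;
    1) echo 'Generic error' ;;
    2) echo 'Parse error' ;;
    3) echo 'File I/O error' ;;
    4) echo 'Network failure' ;;
    5) echo 'SSL verification failure' ;;
    6) echo 'Username/password authentication failure' ;;
    7) echo 'Protocol error' ;;
    8) echo 'Server issued an error response' ;;
    *) echo 'Unknown exit code' ;;
  esac
}

# Usage after a check:
#   wget -q --spider "$url"
#   wget_exit_meaning $?    # for a 403 this would be exit code 8
```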
 
Old 07-22-2017, 12:01 PM   #15
alielkamel
LQ Newbie
 
Registered: Jul 2017
Posts: 12

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by scasey View Post
Suggest removing the -q (quiet) option to see a verbose response; that might explain what's happening. That said, a web search for "wget returned 8" returned:
Code:
Wget may return one of several error codes if it encounters problems.
0
    No problems occurred.
1
    Generic error code.
2
    Parse error—for instance, when parsing command-line options, the ‘.wgetrc’ or ‘.netrc’...
3
    File I/O error.
4
    Network failure.
5
    SSL verification failure.
6
    Username/password authentication failure.
7
    Protocol errors.
8
    Server issued an error response.
So, turning off quiet mode might show you what that error was.


Have you read?
Code:
info wget
Yes, I read it, but I think the server only allows access for the Kodi user agent, because with Kodi I can watch the stream. I tried curl with the Kodi user agent, but I always get 403 Forbidden.
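One thing worth trying (a sketch, with assumptions): send a Kodi-like User-Agent with curl's -A option and request only the headers with -I. The exact User-Agent string Kodi sends is an assumption here; capturing Kodi's real request would show the actual value, and the server may additionally check a Referer or require a session token, in which case no User-Agent alone will help:

```shell
#!/bin/bash
# Hypothetical Kodi-like User-Agent; the real string may differ.
ua='Kodi/17.3 (Linux)'
url=${1:-'http://example.com/playlist.m3u8'}
# -I: fetch headers only, -A: set the User-Agent header.
# echo shows the command that would run; remove echo to execute it.
echo curl -I -A "$ua" "$url"
```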
 
  

