Create a file containing a list of podcast URLs and their names, one pair per line. E.g.
[url] [name]
[url2] [name2]
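For instance, the file could look like this (the feed URLs below are made-up placeholders, not real podcasts):

```shell
# Hypothetical example conf file: one "URL name" pair per line.
# The feed URLs are placeholders for illustration only.
cat > bp.conf <<'EOF'
http://example.com/feeds/linuxshow.rss linuxshow
http://example.com/feeds/retrocomputing.rss retrocomputing
EOF
```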
Then run the following shell script, which I stole and modified from someone on GitHub. The script will download the latest episode of each podcast. It also stores the URLs in a log file and checks, whenever you run it, that it doesn't download the same episode twice.
#!/bin/bash
# Make script crontab friendly:
cd $(dirname $0)
# datadir is the directory you want podcasts saved to:
datadir=$HOME/audio/podcasts
# create datadir if necessary:
mkdir -p $datadir
# Delete any temp file:
rm -f temp.log
# Create new playlist
echo "#Last fetch on $(date +%Y-%m-%d) @ $(date +%r)" > $datadir/latest.m3u
# Read the bp.conf file and wget any url not already in the podcast.log file:
while read podcastfields
do
    podcast=$(echo $podcastfields | cut -d' ' -f1)
    dname=$(echo $podcastfields | cut -d' ' -f2)
    file=$(xsltproc parse_enclosure.xsl $podcast 2> /dev/null | head -n1 \
        || wget -q $podcast -O - | tr '\r' '\n' | tr \' \" | sed -n 's/.*url="\([^"]*\)".*/\1/p')
    mkdir -p $datadir/$dname
    for url in $file
    do
        filename=$(echo $url | awk -F'/' '{print $NF}')
        echo $filename >> temp.log
        if ! grep "$filename" podcast.log > /dev/null
        then
            wget -t 10 -U BashPodder -c -q -O $datadir/$dname/$(echo "$url" | awk -F'/' '{print $NF}' | awk -F'=' '{print $NF}' | awk -F'?' '{print $1}') "$url"
            echo $datadir/$dname/$filename >> $datadir/latest.m3u
        fi
    done
done < bp.conf
# Move dynamically created log file to permanent log file:
cat podcast.log >> temp.log
sort temp.log | uniq > podcast.log
rm temp.log
# Create an m3u playlist:
#ls $datadir | grep -v m3u > $datadir/podcast.m3u
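The dedup check in the loop boils down to something like this (a standalone sketch with made-up filenames, using a temp log in place of podcast.log):

```shell
# Sketch of the script's dedup idea: skip any file whose name is
# already in the log. Filenames here are made up for the demo.
log=$(mktemp)
echo "old-episode.mp3" > "$log"

for filename in old-episode.mp3 new-episode.mp3; do
    if ! grep -q "$filename" "$log"; then
        echo "would download: $filename"   # the real script runs wget here
        echo "$filename" >> "$log"
    fi
done
```

Only new-episode.mp3 triggers a download the first time; run it again and nothing does.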
Also, I have an open question for the thread. I tried to work this out but couldn't:
Forgot to say that the file should be named "bp.conf"
Ryan White
Anyway, I have a lot of scripts. Ask if one seems interesting: 8ch-thread-dl.sh, archive-convert.sh, archive-merge.sh, archive.sh, backup.sh, blur-lock.sh, browser.sh, bsdman.sh, cbz-thumbnailer.sh, cpu-set-governor.sh, cuesplit.sh, eclean-kernel.sh, emake-kernel.sh, extract.sh, ffmpeg-capture.sh, ffmpeg-screenshot.sh, flac-to-webm.sh, flatten-dir.sh, httrack-wikimedia.sh, httrack-wrapper.sh, imagefap.sh, im-diff.sh, im-scale.sh, kobo-convert.sh, loop.sh, manga-reader.sh, mpc-add-random-album.sh, mpc-del-album.sh, mpc-view.sh, mtg-deck-collage.sh, mtg-deck-size.sh, music-convert.sh, optimise-images.sh, parallel.sh, rename-mtime.sh, screenshot.sh, sed-rename.sh, time-avg.sh, vs-image.sh, cycle-wallpaper.sh, webm.sh, zero-pad.sh
Blake Adams
I am using this script that I made myself to download videos from channels that I like. Suggestions on how to improve it are welcome.

#!/bin/bash

function main() {
    downloadVideo TempleOS youtube.com/channel/UCdX4uJUwSFwiY3XBvu-F_-Q
    downloadVideo PhysicsGirl youtube.com/user/physicswoman
}

function downloadVideo() {
    NAME=$1
    URL=$2
    DIRECTORY=$(pwd)
    FOLDER="./videos/"$NAME
    mkdir -p $FOLDER
    cd $FOLDER
    youtube-dl --format "webm/mp4" --download-archive archive.txt -i $URL
    cd $DIRECTORY
}

main
The first argument to downloadVideo is the name of the folder that you want to download to, and the second is a link to a video, channel or playlist. youtube-dl automatically downloads all videos from a channel or playlist; you do not need to manually get links.
youtube-dl -i --extract-audio --playlist-items 1 --batch-file bp.conf

bp.conf should contain links to the channels where the podcasts are. You might need "--download-archive archive.txt" to prevent youtube-dl from re-downloading things every time you start the script, and "--audio-format FORMAT" to select the format that you want.
Brandon Jenkins
I got this off a previous thread. Thought it was nifty.

#!/bin/bash
dec_1=100000  # The value of 1 in our "decimal" variables
pi=314159     # pi (scaled by dec_1)
sine_iters=5  # Accuracy of the sine function. No real "units" to this, configure it experimentally.

sine() {
    local acc=0 fact=1 x=$1 xpow=$1 i=0
    while ((i < sine_iters)); do
        ((acc += xpow / fact,
          i++,
          fact *= -2*i * (2*i + 1),
          xpow = xpow * x / dec_1 * x / dec_1))
    done
    echo "$acc"
}

wave_str='***** +++ '
wave_pos=0
while true; do
    # Scale up from our sine wave (-1..1 dec) to 0..40
    spaces=$((20 * (dec_1 + $(sine "$wave_pos")) / dec_1))
    printf '\n%*s%s' "$spaces" '' "$wave_str"
    wave_str=${wave_str:1}${wave_str:0:1}
    ((wave_pos = (wave_pos + dec_1/10)))
    # Wrap wave_pos around if it reaches pi
    # Avoids the usual problem with Taylor series approximations (lower accuracy for higher magnitudes of x)
    if ((wave_pos > pi)); then
        ((wave_pos -= pi * 2))
    fi
    sleep 0.025
done
Lucas Reyes
Weird, it works similarly to a GH project that I once made that does a similar thing, but specifically with YouTube RSS feeds.
Carson White
I've always known you are a bunch of fags. Also, camel case in bash.
Caleb Kelly
Just made this. The automatic fan control on my ASUS K73SV was kinda shit; the fans got noisy at just 60C. This script makes the fans quieter and less volatile at the cost of increased cpu temps. Not sure how portable this is, but with some adjustments it might work on other laptops with pwm control. Once you successfully run this, either make sure it keeps running or change pwm1_enable back to 2. Needs root privileges. Note that it's mksh, not bash. Feedback welcome, I'm no guru.

#!/bin/mksh
cd /sys/class/hwmon/hwmon2
if [[ -e pwm1_enable && -e pwm1 && -e temp1_input ]] ; then
    echo 1 >> pwm1_enable
else
    exit 1
fi
while true
do
    temp=$(cat temp1_input)
    pwm=$(cat pwm1)
    if [[ $temp -le 55000 ]] ; then
        echo 0 >> pwm1
    elif [[ $temp -gt 90000 ]] ; then
        echo 200 >> pwm1
    elif [[ $temp -gt 85000 ]] ; then
        if [[ $pwm -ne 200 ]] ; then
            echo 150 >> pwm1
        fi
    elif [[ $temp -gt 80000 ]] ; then
        if [[ $pwm -ne 150 ]] ; then
            echo 120 >> pwm1
        fi
    elif [[ $temp -gt 75000 ]] ; then
        if [[ $pwm -ne 120 ]] ; then
            echo 100 >> pwm1
        fi
    elif [[ $temp -gt 70000 ]] ; then
        if [[ $pwm -ne 100 ]] ; then
            echo 80 >> pwm1
        fi
    elif [[ $temp -gt 65000 ]] ; then
        echo 80 >> pwm1
    elif [[ $temp -gt 60000 ]] ; then
        if [[ $pwm -ne 80 ]] ; then
            echo 60 >> pwm1
        fi
    elif [[ $temp -gt 55000 ]] ; then
        echo 60 >> pwm1
    else
        # this shouldn't happen, fast mode just in case
        echo 200 >> pwm1
    fi
    sleep 5
done
An ipfs replacement for all those bullshit puush/imgur/dropbox/pastebin scripts that break after a few months. Keeps a local log of uploads to make it easier to find them later:

#!/bin/sh -eu
: ${IPFS_CMD:=ipfs --api /ip6/::1/tcp/5001}
: ${IPFS_HOST:=ipfs.io}
file_path="$1"
ipfs_path="/ipfs/$(${IPFS_CMD} add -q "${file_path}" | tail -1)"
upload_dir="/uploads"
upload_path="${upload_dir}/$(date +%F_%T)_$(basename ${file_path})"
https_url="${IPFS_HOST}${ipfs_path}"
${IPFS_CMD} files stat "${upload_dir}" >/dev/null \
    || ${IPFS_CMD} files mkdir "${upload_dir}"
${IPFS_CMD} files cp "${ipfs_path}" "${upload_path}"
echo "files API: ${upload_path}"
echo "Link: ${https_url}"
xclip
Guess I should explain a little:
- ffmpeg-capture: just click on a window and it starts recording
- flac-to-webm.sh: be careful, the webms produced are special. They can't be seeked and start playing only after fully buffering. Minimal size, though.
- music-convert.sh: recursively convert a folder
- zero-pad.sh: pretty explicit (I hope).
Nolan Mitchell
Quote your variables, faggots.
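For anyone wondering why (minimal demo, filename is made up):

```shell
# Unquoted expansions get word-split on whitespace; quoted ones don't.
d=$(mktemp -d)
f="$d/my file.txt"     # a filename with a space, made up for the demo
touch "$f"             # quoted: creates exactly one file
[ -e "$f" ] && echo "quoted: file exists"
# touch $f             # unquoted: would create "$d/my" and "./file.txt" instead
```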
Hunter Taylor
Poor man's `sponge' (moreutils): a tool that soaks up standard input, which allows reading from and writing to the same file, like this:

uniq something.txt | schwamm something.txt
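The script body itself didn't survive the thread, but a minimal sketch of the idea, reconstructed here as a shell function, could look like this:

```shell
# Sponge-like function (a reconstruction, not the poster's code):
# soak up ALL of stdin into a temp file first, then overwrite the
# target, so reading and writing the same file in one pipeline is safe.
schwamm() {
    tmp=$(mktemp) || return 1
    cat > "$tmp"                 # consume stdin completely
    if [ $# -ge 1 ]; then
        cat "$tmp" > "$1"        # only now clobber the target
    else
        cat "$tmp"               # no target: behave like cat
    fi
    rm -f "$tmp"
}
```

Usage: `uniq something.txt | schwamm something.txt` rewrites the file in place without truncating it mid-read.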
You should clean up your code a bit here. Make the references to files use their full names (I recommend either ~/.config/bp/conf or /etc/bp/conf), and put the logs in /var/log. I tried to clean it up that way but it is not downloading correctly anymore.
#!/bin/bash
# Make script crontab friendly:
cd $(dirname $0)
# datadir is the directory you want podcasts saved to:
datadir=$HOME/Music/podcasts
# confdir is where your conf files will go
confdir=$HOME/.config/bp
# logdir is the master log file dir
logdir=$confdir/logs
# create datadir if necessary:
mkdir -p $datadir
# create conf file if necessary
if [[ -d $confdir ]]; then
    sleep 0
else
    mkdir -p $confdir
    touch $confdir/bp.conf
    echo "URL Name" > $confdir/bp.conf
    echo -e "You need to configure your conf file, located in $confdir/bp.conf"
    exit 1
fi
# create log file if necessary
if [[ -d $logdir ]]; then
    if [[ -f $logdir/podcast.log ]]; then
        sleep 0
    else
        touch $logdir/podcast.log
    fi
    sleep 0
else
    mkdir -p $logdir
    touch $logdir/podcast.log
fi
# Create new playlist
echo "#Last fetch on $(date +%Y-%m-%d) @ $(date +%r)" > $datadir/latest.m3u
# Read the bp.conf file and wget any url not already in the podcast.log file:
while read podcastfields
do
    podcast=$(echo $podcastfields | cut -d' ' -f1)
    dname=$(echo $podcastfields | cut -d' ' -f2)
    file=$(xsltproc parse_enclosure.xsl $podcast 2> /dev/null | head -n1 \
        || wget -q $podcast -O - | tr '\r' '\n' | tr \' \" | sed -n 's/.*url="\([^"]*\)".*/\1/p')
    mkdir -p $datadir/$dname
    for url in $file
    do
        filename=$(echo $url | awk -F'/' '{print $NF}')
        echo $filename >> /tmp/temp.log
        if ! grep "$filename" $logdir/podcast.log > /dev/null
        then
            wget -t 10 -U BashPodder -c -q -O $datadir/$dname/$(echo "$url" | awk -F'/' '{print $NF}' | awk -F'=' '{print $NF}' | awk -F'?' '{print $1}') "$url"
            echo $datadir/$dname/$filename >> $datadir/latest.m3u
        fi
    done
done < $confdir/bp.conf
# Move dynamically created log file to permanent log file:
cat $logdir/podcast.log >> /tmp/temp.log
sort /tmp/temp.log | uniq > $logdir/podcast.log
rm /tmp/temp.log
# Create an m3u playlist:
#ls $datadir | grep -v m3u > $datadir/podcast.m3u
Evan Allen
Use read to split the line into fields instead of cut:

while read podcast dname
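For illustration, read does the field splitting itself, so no cut subshells are needed (the feed URL and name below are placeholders):

```shell
# read splits each input line on whitespace (IFS) straight into the
# named variables. Values here are placeholders for the demo.
while read -r podcast dname; do
    printf 'feed=%s dir=%s\n' "$podcast" "$dname"
done <<'EOF'
http://example.com/feed.rss myshow
EOF
```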
Will not work anymore, the url format changed. For example the url of the OP is /file_store/5d770667648c05474708c4f98b04dc81f4e57133cbd63f864903f71a33b6c244.png

Use:

-I '*/src,/file_store'

Use parameter expansion:

"${1##*/}"
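That parameter expansion replaces a whole awk/basename pipeline for stripping the directory part (the example path below is made up):

```shell
# "${var##*/}" deletes the longest prefix matching "*/", leaving just
# the final path component. Example URL is a made-up placeholder.
url="http://example.com/file_store/abc123.png"
filename="${url##*/}"
echo "$filename"   # abc123.png
```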
Evan Martinez
Guess you're right. That's the only script I didn't make myself, so that explains it.
Hudson Lopez
You got a script that converts any file into an mp4, and another that downloads the same video from YouTube and reverses the process?
Chase Flores
A case statement could be used instead of the repetitive if/else chain. Note the pattern ranges: a bare label like 90) would only match one bucket of $(((temp - 1)/1000)), so each arm needs to cover its whole band:

case $(((temp - 1)/1000)) in
    9?|1??) ... ;;   # > 90000
    8[5-9]) ... ;;
    8[0-4]) ... ;;
    7[5-9]) ... ;;
    7[0-4]) ... ;;
    6[5-9]) ... ;;
    6[0-4]) ... ;;
    5[5-9]) ... ;;
    *)      ... ;;   # <= 55000
esac
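A runnable sketch of that mapping, with the fan script's pwm values filled in (wrapping it in a function, and the function name, are my additions):

```shell
# Map a millidegree temp reading to a pwm value with one case statement.
# Pattern ranges cover each 5000-unit band of the original if/else chain;
# the helper name pwm_for_temp is made up for this sketch.
pwm_for_temp() {
    case $((($1 - 1) / 1000)) in
        9?|1??) echo 200 ;;   # > 90000
        8[5-9]) echo 150 ;;
        8[0-4]) echo 120 ;;
        7[5-9]) echo 100 ;;
        7[0-4]) echo 80 ;;
        6[5-9]) echo 80 ;;
        6[0-4]) echo 60 ;;
        5[5-9]) echo 60 ;;
        *)      echo 0 ;;     # <= 55000
    esac
}
```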