Hello. This is my first post; I hope all is well. Thank you for making this program. It's the only MTG engine I have used besides the MTG games on XBLA circa 2012, and it's pretty awesome.
I ran into the same bug as muaddib above. The site has moved to Scryfall, and on top of that the filenames and search URLs no longer match, so the Magarena program fails to download the card images.
I took a look at the source code, and I must admit I don't fully understand how everything ties together. I see a lot of awk scripts, which I haven't used in decades. If you would like help, I'm happy to contribute, but I'd need more information on how to get started.
I've read the posts about copyright infringement and I think I understand them; in case I have missed something, please advise accordingly.
As for the bug above, I wrote a little program in Go to download card images from Scryfall.
It seems that the card image URLs now use a UUID-style identifier instead of the card name, so I wrote a basic scraper using goquery.
This is the version that works (both versions compile).
- Code: Select all
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"os"
	"strings"

	"github.com/PuerkitoBio/goquery"
)

func getCardData() {
	// os.Args[0] is the program name, so a card name means at least 2 args
	if len(os.Args) < 2 {
		fmt.Println("Usage: mtg \"Name Of Card\"")
		os.Exit(1)
	}
	name_of_card := os.Args[1] // name of card from command line
	url := fmt.Sprintf("https://scryfall.com/search?q=%v", name_of_card)
	doc, err := goquery.NewDocument(url)
	if err != nil {
		log.Fatal(err)
	}
	// Find each card in the search results grid
	doc.Find("a.card-grid-item-card").Each(func(i int, s *goquery.Selection) {
		card_img_url, card_img_url_exists := s.Find("img.card").Attr("src")
		card_name := strings.TrimSpace(s.Find(".card-grid-item-invisible-label").Text()) + ".jpg"
		if card_img_url_exists {
			fmt.Printf("%v : %v\n", card_name, strings.TrimSpace(card_img_url))
			downloadFile(card_img_url, "cards/"+card_name)
		}
	})
}

func downloadFile(url string, filepath string) {
	os.Mkdir("cards", os.ModePerm)

	// Skip the request entirely if we already have the file
	if fileExists(filepath) {
		fmt.Println("File exists!")
		return
	}

	// Get the data
	resp, err := http.Get(url)
	if err != nil {
		fmt.Println("Error while downloading", url, "-", err)
		return // resp is nil on error, so Close below would panic
	}
	defer resp.Body.Close()

	// Create the file
	out, err := os.Create(filepath)
	if err != nil {
		fmt.Println("Error while creating", filepath, "-", err)
		return
	}
	defer out.Close()

	// Write the body to the file
	_, err = io.Copy(out, resp.Body)
	if err != nil {
		fmt.Println("Error while writing file data", filepath, "-", err)
	}
}

func fileExists(filename string) bool {
	info, err := os.Stat(filename)
	if err != nil {
		// covers "does not exist" as well as any other stat error
		return false
	}
	return !info.IsDir()
}

func main() {
	getCardData()
}
In this version I just tried to clean some things up and refactored some logic. It compiles, but fails to download anything: debugging shows that card_img_url_exists comes back false, so nothing is downloaded. I'm guessing that for some odd reason goquery fails to fetch the URL into doc. Anyway, if someone knows Go and can help me, cool. Otherwise I might try Python and Beautiful Soup for an MTG card scraper.
- Code: Select all
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"net/url"
	"os"
	"strings"

	"github.com/PuerkitoBio/goquery"
)

func getCardData() {
	// os.Args[0] is the program name, so a card name means at least 2 args
	if len(os.Args) < 2 {
		fmt.Println("Usage: mtg \"Name Of Card\"")
		os.Exit(1)
	}
	name_of_card := os.Args[1] // name of card from command line
	fmt.Println(name_of_card)
	// URL-encode the card name so spaces become '+'
	searchURL := fmt.Sprintf("https://scryfall.com/search?q=%v", url.QueryEscape(name_of_card))
	fmt.Println(searchURL) // debug: show the URL actually requested
	doc, err := goquery.NewDocument(searchURL)
	if err != nil {
		log.Fatal(err)
	}
	// Find each card in the search results grid
	doc.Find("a.card-grid-item-card").Each(func(i int, s *goquery.Selection) {
		card_img_url, card_img_url_exists := s.Find("img.card").Attr("src")
		card_name := strings.TrimSpace(s.Find(".card-grid-item-invisible-label").Text()) + ".jpg"
		if card_img_url_exists {
			fmt.Printf("%v : %v\n", card_name, strings.TrimSpace(card_img_url))
			downloadFile(card_img_url, "cards/"+card_name)
		}
	})
}

func downloadFile(url string, filepath string) {
	os.Mkdir("cards", os.ModePerm)

	// Skip the request entirely if we already have the file
	if fileExists(filepath) {
		fmt.Println("File exists!")
		return
	}

	// Get the data
	resp, err := http.Get(url)
	if err != nil {
		fmt.Println("Error while downloading", url, "-", err)
		return // resp is nil on error, so Close below would panic
	}
	defer resp.Body.Close()

	// Create the file
	out, err := os.Create(filepath)
	if err != nil {
		fmt.Println("Error while creating", filepath, "-", err)
		return
	}
	defer out.Close()

	// Write the body to the file
	_, err = io.Copy(out, resp.Body)
	if err != nil {
		fmt.Println("Error while writing file data", filepath, "-", err)
	}
}

func fileExists(filename string) bool {
	info, err := os.Stat(filename)
	if err != nil {
		// covers "does not exist" as well as any other stat error
		return false
	}
	return !info.IsDir()
}

func main() {
	getCardData()
}
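For what it's worth, one difference worth ruling out when doc comes back empty is the User-Agent: goquery.NewDocument fetches the page with Go's default HTTP client, and some sites serve different (or no) markup to non-browser clients. This is only a guess, not a confirmed cause. Here's a stdlib-only sketch of building the request yourself and setting an explicit User-Agent (the value is just an example; the goquery hand-off is shown as a comment):

```go
package main

import (
	"fmt"
	"net/http"
)

// newScryfallRequest builds a GET request with an explicit User-Agent
// header, so the server doesn't see Go's default client identity.
func newScryfallRequest(rawURL string) (*http.Request, error) {
	req, err := http.NewRequest("GET", rawURL, nil)
	if err != nil {
		return nil, err
	}
	req.Header.Set("User-Agent", "Mozilla/5.0 (compatible; mtg-downloader)")
	return req, nil
}

func main() {
	req, err := newScryfallRequest("https://scryfall.com/search?q=black+lotus")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Header.Get("User-Agent"))
	// To actually fetch and parse:
	//   resp, err := http.DefaultClient.Do(req)
	//   ...
	//   doc, err := goquery.NewDocumentFromReader(resp.Body)
}
```

If this works where NewDocument doesn't, the server was most likely filtering on the User-Agent.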
You can download the linux x64 binary here:
EDIT: It won't let me post a link to google drive where I've uploaded the binary as this is my first post and it considers it spammy.
Save the mtg binary anywhere (/usr/local/bin/mtg works well); to run it globally:
- Code: Select all
mtg "Card Name"
or you can run it from the directory it's in like this:
- Code: Select all
./mtg "Card Name"
You can also take the
- Code: Select all
standard_all.txt
or any of the txt files in
[url]github/magarena/magarena/tree/master/cards[/url] and run the following command in a bash shell
(if installed in /usr/local/bin)
- Code: Select all
while IFS="" read -r p || [ -n "$p" ]; do mtg $p ; done < standard_all.txt
(if running in current directory)
- Code: Select all
while IFS="" read -r p || [ -n "$p" ]; do ./mtg $p ; done < standard_all.txt
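If you'd rather have the whole multi-word name reach the binary as a single argument (e.g. for the URL-encoding version), quote the variable as "$p" instead; note this gives up the first-word keyword matching described below. A self-contained demo of the difference, using a stub function in place of the real mtg binary:

```shell
# stub standing in for the mtg binary, to show what arrives as $1
mtg() { echo "first arg: $1"; }

# hypothetical one-line card list standing in for standard_all.txt
printf 'Zurgo Helmsmasher\n' > card_list.txt

# unquoted: mtg $p   -> first arg: Zurgo            (word-split by the shell)
# quoted:   mtg "$p" -> first arg: Zurgo Helmsmasher (one argument)
while IFS="" read -r p || [ -n "$p" ]; do mtg "$p" ; done < card_list.txt

rm card_list.txt
```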
It will use the card name as the search term on Scryfall, fetch a page of all the cards matching that keyword/cardname, and download the images as 'Card Name.jpg' instead of /cards/normal/front/1/3/13f4bafe-0d21-47ba-8f16-0274107d618c.jpg?1562782879.
It will generate a log like this:
- Code: Select all
Zurgo Helmsmasher.jpg : scryfallurl/cards/normal/front/1/3/13f4bafe-0d21-47ba-8f16-0274107d618c.jpg?1562782879
If the file already exists, it will show the above message followed by:
File exists!
The reason I tried to clean up my code was that, when I compiled the first binary, I forgot to URL-encode the card name: spaces are never replaced with '+', so Scryfall only receives the first word of a multi-word card name.
A happy side effect is that the first word is used as the keyword, which matches not only the card you're looking for on Scryfall but also any card with that string anywhere in its name.
So on subsequent requests, if a certain card was already downloaded, the program informs you that it exists and doesn't bother downloading or saving it again.
Within an hour I was able to download 6k images using just the standard_all.txt list of Standard cards.
It shows the card image URL and the filename it was saved under. I chose to save cards as 'Card Name.jpg' because that's exactly what I saw in the Magarena folder: /images/cards/ contains cards named by their card name/title.
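One caveat with saving as 'Card Name.jpg': split cards like 'Fire // Ice' have a '/' in the label, which os.Create would treat as a directory separator. A small hypothetical helper (sanitizeFilename is my own name, not anything from Magarena) that guards against this before building the path:

```go
package main

import (
	"fmt"
	"strings"
)

// sanitizeFilename replaces path separators in a card name so the
// whole name stays a single filename, then trims stray whitespace.
func sanitizeFilename(name string) string {
	r := strings.NewReplacer("/", "-", "\\", "-")
	return strings.TrimSpace(r.Replace(name))
}

func main() {
	fmt.Println(sanitizeFilename("Fire // Ice")) // prints "Fire -- Ice"
}
```

You would call it as downloadFile(card_img_url, "cards/"+sanitizeFilename(card_name)) in the scraper above.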
I've got /cards populated with card images from Scryfall, named after the card names. Now I just don't know how to use them: I drop them into the /cards folder under Magarena's images folder, but when I play the game, no cards show up.
Can you advise how to manually download images like this, where to put them, and how to get the card artwork to display while playing Magarena?
Thank you.
I hope someone finds this useful. Once the card images have been downloaded, I assume they can be used with any MTG engine that expects the 'Card Name.jpg' format.
Cheers!
P.S. I had to remove all the hyperlinks etc. from the post, otherwise it wouldn't let me post. Just fill in the proper domain names and you're good to go.