File talk:1001 Days in Urban Dead.gif
From The Urban Dead Wiki
Latest revision as of 14:47, 18 June 2011
Inclusion
Just keeping the page off the unused files list, since you'd only ever link this directly because the wiki can't handle it. -- RoosterDragon 12:19, 16 June 2011 (BST)
Suburb Data
How did you gather all the data used to create this gif? Did you ask Kevan for the diff on every suburb danger page? I remember doing it once; I gave him the SQL query and he returned the full history in a zip. Too bad I didn't finish that project, it looked promising, as does this image (which is fantastic, btw) --hagnat 14:12, 17 June 2011 (BST)
- The data I have (which Kevan hosted in a zip somewhere on UDWiki; I can give you the link if you want it) runs from April 2006 through September 2007, gathered with this query:
SELECT
  wp.page_id AS id,
  wp.page_title AS page,
  wr.rev_timestamp AS timestamp,
  wt.old_text AS text
FROM wikirevision wr
LEFT JOIN wikipage wp ON wp.page_id = wr.rev_page
LEFT JOIN wikitext wt ON wr.rev_text_id = wt.old_id
WHERE wp.page_title LIKE '%DangerReport%'
  AND wp.page_namespace = 2
ORDER BY wp.page_id ASC, wr.rev_timestamp ASC
- I remember now why my project didn't go on: there was so much data from non-suburb danger reports, and formatting the suburb data was kind of a pain because some edits would break whatever script I tried running over them. --hagnat 14:24, 17 June 2011 (BST)
- Yeah, I just loaded each revision and appended a line to a text file for that suburb with the status and time. If my script couldn't find the status, it just saved the time, which meant I could go over the files afterwards and manually fill in the blanks from borked edits.
- Having older revisions would be great for filling in the missing history; it'd be even better if we could have the whole pre-purge stuff so that everything is covered. {{User:The_Rooster/Sig}} 15:47, 18 June 2011 (BST)
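The approach described above (scan each revision's wikitext for the danger status, append "timestamp status" to a per-suburb log, and fall back to the bare timestamp when parsing fails) can be sketched roughly as follows. This is a minimal illustration, not the original script: the `status=` field name in the {{DangerReport}} template and the shape of the revision tuples are assumptions based on the query output described above.

```python
import re
from collections import defaultdict

# Hypothetical status field in the DangerReport template wikitext.
STATUS_RE = re.compile(r"\|\s*status\s*=\s*(\w+)", re.IGNORECASE)

def log_revisions(revisions):
    """revisions: iterable of (suburb, timestamp, wikitext) tuples,
    e.g. rows from the SQL dump. Returns {suburb: [log lines]}."""
    logs = defaultdict(list)
    for suburb, timestamp, text in revisions:
        match = STATUS_RE.search(text)
        if match:
            logs[suburb].append(f"{timestamp} {match.group(1)}")
        else:
            # Borked edit: keep only the time so the blank
            # can be filled in manually afterwards.
            logs[suburb].append(str(timestamp))
    return logs

# Example with made-up revisions:
revs = [
    ("Dakerstown", "20060401120000", "{{DangerReport|status=dangerous}}"),
    ("Dakerstown", "20060402130000", "vandalised page, no template"),
]
print(log_revisions(revs)["Dakerstown"])
```

Each suburb's list can then be written out as one text file per suburb, matching the workflow described in the thread.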