I use Eclipse with PyDev as my IDE of choice. Um, if you do not know what that means, this page is not for you. And in fact, even if you know what that means, this page is, still, most likely, not for you.
I have too many program files stored in too many modules, the majority of which I have (absolutely) no intention of ever running again. And it's starting to clutter up the place. Meaning, it's time for me to put all the old programs on ice. And for me, that means posting them to the web.
Thus, here it is, my old and unused code. I do not warrant it. I certainly do not license it. But given all that, I can't say I particularly care what anyone else does (or does not) do with it.
© Copyright Brett Paufler (all rights reserved), the lot.
Anyway, as follows, please find Python Code saved as plain text files (as opposed to executable .py files), arranged in reverse random order (whatever that's supposed to mean) based on what got posted when and where for reasons unknown.
None of this is intended to be particularly helpful to contemporary (i.e. modern age) humans, which makes it in perfect conformance with the rest of my life.
And, no. I'm not going to be doing some final refactor prior to posting. This was all good enough at one time. So, it is most certainly good enough for the garbage bin. But I will go through it all (sort of, kind of) one more time just to make sure I haven't said something stupid.
No Warranty
No Permissions
No License
No Nothing
And once again, if that's the sort of thing you are looking for, you are likely in the wrong place.
One further note (perhaps of many more to come): I am not updating the information contained within the files (as I see this as primarily of interest to The Committee on Archaeological Code), so some contain outdated information, including domain names which I no longer own and email addresses which I no longer monitor.
A second further note, the dependencies worked at one time. But as this was before I realized external updates would kill my code, I did not track such things or make any provisions for it.
CIV IV BAT/BUG Logfile Graphing Utilities
It's highly unlikely that I will load the game again (bad eyesight and all). So, it's time to clear this stuff out of The IDE. Many of the graphs in the CIV IV Spur were created with the following scripts using a BAT/BUG Log as the input.
CIV Files Read Me: For those who want a few more words.
CIV Research: Outputs Research Order of Techs... among other things.
CIV Combat Graph: This one is a winner.
CIV Graph Commerce: Gold, Beakers, Espionage, etc.
CIV Compare Research: Graph showing relative research from two different games.
CIV Parse Logfile: A script which seems pretty similar to the one two above.
CIV Parse Tech Tree: Path and Cost to get from Here to There in The Tech Tree.
CIV Workspace: Scratchpad Devoid of Detail.
CIV IV Logfile Parse: Abandoned Refactor.
Arch Linux Install: An abandoned write up, preserved for posterity on account of vanity, vainglory, and other nebulous concepts of the like.
Not Vetted! Do Not Use! Posted as an Archaeological Record... at best! Disclaimers which should be extended to all other code posted on this site... on the theory that
A Disclaimer Anywhere is a Disclaimer Everywhere.
Cleaning Up
The Image Filters
Sound Check: Cracking Open a Sound File for the first and only time.
PIL Filter Effects: I never much cared for PIL (Python Imaging Library). It's about time I removed this code from my IDE.
Color Highlight: An Abandoned Project. Most of my newer filters require a personal library (namely, Img) to function, and I do not anticipate posting/pulling Img in this go round.
Circle Fill: Fills Image with Biggest Possible Circle until no longer possible. A full write-up can be found elsewhere.
Word Tile: Using ImageMagick to achieve a Matrix like effect. Broken, as stands. But it worked at one time. Of course, Matrix Like is a bit of an overstatement. So, let's just say it was Matrix Inspired and call it a day.
Civilization IV - Beyond The Sword
CIV WB Save To Dict: Transforms a CIV IV World Builder File into a Dictionary.
CIV WB Save To PNG: Takes preceding Dictionary and Outputs a PNG.
CIV Crack Save Game Files: Broken. But I am led to believe "789c" is an important part of the solution.
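(For the curious: '789c' is the usual two-byte header of a zlib stream, so the likely approach, and this is a fresh sketch rather than the original script, is to scan the save file for that marker and try to inflate whatever follows.)

import zlib

def extract_zlib_chunks(path):
    # '\x78\x9c' is the common zlib header; scan for it and try to inflate.
    data = open(path, 'rb').read()
    chunks = []
    start = data.find(b'\x78\x9c')
    while start != -1:
        try:
            chunks.append(zlib.decompressobj().decompress(data[start:]))
        except zlib.error:
            pass  # the two bytes also occur by chance; skip false positives
        start = data.find(b'\x78\x9c', start + 2)
    return chunks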
TXT from HTML: A One-Shot Script I should have deleted a long time ago.
CIV Plot Culture: I was trying to understand how Cities exert Culture over Tiles.
Maybe, I will refactor my CIV IV Graphing Utilities, you know, before too terribly long. But for now, I'm just cleaning up, giving myself room to think.
IDE Kill File Notes: Chaotic Notes regarding The Reorganization of My IDE, as made prior to settling upon this spur (Dead Code) as the resolution and repository of choice. It's a step on the path, rapidly fading back into the mist from which it arose. Or if that's not clear: the theme of this page is largely Things I've Removed From My IDE, and this file (if not code) is, certainly, one more such thing.
Email & Notes
Utilities which take an Email or Text Note as input, typically outputting a Semi-Well Formatted Web Page. For the most part, these have been reformatted into a different directory, comprising my current working set.
en_workspace: Every directory (of mine) has one. And if we are posting the directory...
en_webpage_book: Formatted Email In - Formatted Webpage Out.
en_webpage_ideas: Per Previous. Idea Type Pages morphed into The Journal Project.
en_webpage_diary: The same concept.
en_webpage_skip_skim: These are basically cut and paste scripts; hence, the refactor.
en_email_to_ideas: An older version of the similarly named (above).
en_notes_image_extractor: Saves Embedded Images from Emailed Notes Files. Principally used for sketches. See various Art Write-Ups. I may want to add this one back into my working set.
en_email_to_text: Extracts Text.
en_utilities_email: Gets Text From Emails.
en_utilities_text: Processes Text (i.e. un-encodes) & Pre-Formats (i.e. re-encodes).
en_utilities_txt_files: Pre-Formats Text Files.
Note: I typically rename scripts upon archiving. Thus, some degree of name-unscrambling will be required for the imports.
sms: The start to an abandoned project intended to extract text messages, previously saved in HTML Format.
ASCII_ART_HEADERS: Takes .flf (i.e. figfont) files as input and outputs simple Banner Headers, wherein each letter is composed from multiple symbols.
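(If all one wants is the banner effect rather than the parser, the pyfiglet package will read figfonts for you. A substitute, mind, not what the above does.)

from pyfiglet import Figlet

# 'standard' is a stock figfont; any installed .flf font name works here.
print(Figlet(font='standard').renderText('DEAD CODE'))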
magazine: disposable script used to collate individual files in The Magazine Project.
The End of the First Pull
Since this page runs from the bottom to the top (certainly, the code links do: i.e. that is the order of their posting, prior entries being further down the page and subsequent entries being stacked on top), this note marks The End of the First Pull.
Chicago Crimes
I'll sneak this in as my last entry before The End of the First Pull. This project was intended to visually map crime in Chicago (by using GPS coordinates as found in the arrest records). It's been sitting on my desk with no action for two years. And it is not what I want to work on next. So, it's time for it to go. Among the reasons it has been on hold is that it will be a pain to categorize the Crime Codes. And I've never really settled on how I want to display the data. There are so many points, it will get incoherent quite quickly. Anyhow, the raw data ate a gig, the reduced file 600mb. And that's just too much to be sitting around on my desktop, waiting indefinitely.
crime_create_yearly: reduces and separates a very large data file into manageable chunks.
Dungeon & Mazes
Having solved the problem once, I tried again using Trees. But I abandoned that attempt. The first three work great. See the write-up for more information.
maze_output: the main for these first few modules. Creates PNG's of Grid Dungeons.
maze: main logical control.
maze_squares: the building blocks for the maze. The grid sub-structures.
As follows are the same thing (possibly incomplete), utilizing trees.
maze_tree_dungeon: Hey! Hey! My notes say that it works and outputs a valid image. Try it, today.
maze_tree_genetic: some genetic logic: i.e. controlled randomization, using Genetic Algorithms.
maze_tree: contains the base Node and Tree Classes.
maze_tree_draw_graph: creates (I assume) some sort of graphical output using networkx.
color_shebang: creates a ropey sort of Perlin Noise. I like it. But I'd just use a pre-packaged solution these days.
Imgur
I built an image downloader for Imgur. I don't think I ever used it. It's not that good. And it will need lots of help, making it completely useless for anyone but code junkies and archaeological explorers.
imgur_main: saves information about the front page... or so, my notes say. Note: This does not use the API. I assume there is an Imgur API. So, I am sure there are better and friendlier solutions out there.
imgur_crawl: is a downloading utility for the previous... among other things. Meaning, I should have used more modules. But this was back in the day. So, what are you going to do?
imgur_logic: another short utility file for this project.
imgur_pageup: the start of an abandoned Heads Up Display.
Outside of data collection (which I don't think I ever got around to doing), I decided the already existing Web Browsers were much better than anything I was going to come up with.
web_download_sequential: downloads sequences of resources of the form name_01.pdf to name_99.pdf from websites.
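(The general idea, as a fresh sketch with a made-up URL rather than the original code, would be something like the following.)

import urllib.error
import urllib.request

# Placeholder pattern; the real target site is long forgotten.
BASE = 'https://example.com/files/name_{:02d}.pdf'

for n in range(1, 100):
    try:
        urllib.request.urlretrieve(BASE.format(n), 'name_{:02d}.pdf'.format(n))
    except urllib.error.HTTPError:
        break  # stop at the first missing file in the sequence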
Commodities
At one time, I was way more interested in the trading price of corn and other commodities than I am now.
SQL Based
This was the second attempt.
grain_db_create: makes a SQL database.
grain_db_query: queries said database.
grain_db_trades: makes a graph for a write-up.
Pandas Based
In retrospect, there was no reason to create the SQL code. But I haven't worked much with SQL, so I am sure that was a motivating factor. Anyway, these final four code scripts utilize pandas data-frames.
grain_make_df: creates the data-frame from raw csv data-files. Since the script references oil and oil was one of the last commodities I looked at, I'm guessing I massaged this code to meet my needs as I went along. This would have been long before I learned to use CONTROLLING_VARIABLES, such as CURRENT_COMMODITY = 'oil' or the like.
grain_graph_daily: from raw data to a graph of daily closing prices in one simple script.
grain_graph_high_low: starts with a pickle of the data-frame. Um, I don't know where or when the pickle was created. But I'm sure it's somewhere in the code.
grain_graph_roll_ave: more graphs. There's a lot of data munging for those graphs. But at least, it is stepwise... long-winded, but simple, step by step.
Commodity Data
I was going to save this Daily Price Data. It weighs in at a mere 5.61mb (1.22mb compressed). And as such, it's not taking up that much space. But I can't see ever using it again. So, away it goes. More insightfully (and/or more importantly), I've got another data set that's been sitting on my desktop, which weighs 566mb. I really should do something with it: reduce it to the important bits, save it to disc, or something. I need to be more disciplined about these things.
Website Maintenance
I haven't looked at this code in years... let alone, used it. I present them in Alphabetical Order (as opposed to some of the preceding and/or following groups, which are conceptually organized), as these all appear to be stand alone scripts. The common feature being that at one time I used them to solve problems in connection with maintaining my websites.
delete_duplicate_image: checks name and hash, deleting duplicate images if they are the same, used in conjunction with compressing multiple copies of my websites, as I am wont to do in order to time-stamp when I posted something. But now (when this is relevant), I use a .wim compression algorithm, which handily solves the problem.
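(The core of the idea, sketched from memory rather than lifted from the file.)

import hashlib
import os

def file_hash(path):
    # Hash the file contents; identical files give identical digests.
    with open(path, 'rb') as f:
        return hashlib.sha1(f.read()).hexdigest()

def delete_duplicates(dir_a, dir_b):
    # Delete a file from dir_b only when dir_a holds a file with
    # the same name AND the same hash.
    for name in os.listdir(dir_b):
        a, b = os.path.join(dir_a, name), os.path.join(dir_b, name)
        if os.path.isfile(a) and os.path.isfile(b) and file_hash(a) == file_hash(b):
            os.remove(b)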
html_remove_style: removes those silly tag style notations that MS Word or similar insists on adding. But I no longer convert from .doc to .html (rather, I copy/paste the relevant text), so I haven't used it in years.
html_links: offline checker (and maybe, corrector) for my website. I just use Notepad++ for such things these days, as it can modify all files in a directory (along with any corresponding sub-directories) all at once.
html_rant_format: a specialized formatter, which I no longer use and forgot that I had. I have no idea as to the input... likely MS Word .html outputted files. My entire tool-chain is different, these days. This truly is Dead Code.
html_src_rename: another mass effect .html tool, which I no longer use. This one's purpose was to insert './images/' before any image source, so I could reorganize my website directories. Things were getting messy. Also, some of my early image names were entirely too long: 'This Page - This Section - This Shoot - Identifying Name of Needless Complexity - 01.jpg'. So, this shortened them. At present, I have absolutely no intention of cleaning up the old stuff on this site. And if things get too complicated, I'll simply start a new Spur.
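(A one-line regex covers the gist of the insertion. Again, a sketch, not the original.)

import re

def prefix_image_sources(html):
    # Insert './images/' before bare filenames in src attributes,
    # leaving anything that already has a path alone.
    return re.sub(r'src="(?![./])([^"/]+)"', r'src="./images/\1"', html)

# prefix_image_sources('<img src="photo.jpg">')  ->  '<img src="./images/photo.jpg">'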
Reddit
As I recall, praw provides Python bindings for the Reddit API, which in turn, I used to scan the incoming feed and select portions of the site (i.e. subreddits) to look at further. The link in this section's header (i.e. Reddit) leads to the full data-based write-up for this project. I don't believe those pages include any of the following code.
reddit_stream_view: captures the incoming feed (what I call the nozzle, the latest user input) and saves to csv.
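(The praw streaming pattern looks roughly like the following. The credentials are placeholders, and this is not the original script.)

import csv
import praw

reddit = praw.Reddit(client_id='XXXX', client_secret='XXXX',
                     user_agent='dead-code-example')

with open('nozzle.csv', 'w', newline='', encoding='utf-8') as f:
    writer = csv.writer(f)
    writer.writerow(['subreddit', 'title', 'created_utc'])
    # stream.submissions() yields new site-wide submissions as they arrive
    for post in reddit.subreddit('all').stream.submissions():
        writer.writerow([post.subreddit.display_name, post.title, post.created_utc])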
reddit_update_manifests: tracks subreddits. Um, the documentation really sucks on most of this Old Code. Sorry, I have only myself to blame. When it was in use, there was no need. And now, I can't be bothered.
reddit_download_images: should be self explanatory. I am sure it requires a manifest from one of the previous two to function.
reddit_offline: assembles data in a pandas data-frame.
reddit_thousands: pulls the top 1,000 Front Page Submissions at the moment.
reddit_text_test: an initial attempt to tokenize text, if I remember correctly. Certainly, such things as rating speech for empathetic intent have interested me at various times in the past.
reddit_web_fp_summary: one-shot code used for a write-up regarding the front page. Vulgarity (i.e. curse words, swear words, and so on) abounds, as that was part of what I was searching for and trying to understand.
reddit_web_graph_subs: creates graphs for the subreddits... or so I presume.
reddit_web_summarize: munges data in pursuit of charts and graphs.
I should note somewhere: commonly, all of my code resides in a single directory. The organization is flat.
family_tree: determines familial relations from a properly formatted list of tuples.
[('Name', 'Sex', 'Mommy', 'Daddy'), ('Name', 'Sex', 'Mommy', 'Daddy'), ... ]
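(A guess at how such a lookup might have worked, with invented names.)

def siblings(records, name):
    # Two people are (full) siblings if they share both Mommy and Daddy.
    people = {r[0]: r for r in records}
    _, _, mom, dad = people[name]
    return [r[0] for r in records
            if r[0] != name and r[2] == mom and r[3] == dad]

family = [('Alice', 'F', 'Carol', 'Dave'),
          ('Bob', 'M', 'Carol', 'Dave')]
print(siblings(family, 'Alice'))  # ['Bob']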
Craig Reynolds' Boids
This was fun. I enjoy Grid Games and/or Agent Based Problems... this being an example of the latter, while Conway's Life is an example of the former.
boids: code.
mandelbrot: image creator using numpy and hardwired values. I believe this code can be found elsewhere on the site... say, in conjunction with my Mandelbrot write-up. But at the moment, I cannot be bothered to check or provide a link.
fibonacci_trampoline: I am sure a 'trampoline' is some sort of programming construct. But I no longer remember the particulars. Let us assume (or at least, hope) any trampoline remembers previously computed values and that is its saving grace. But at this point, I really have no idea.
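(For the record, the usual meaning of a trampoline is a loop that keeps calling returned thunks, so deep recursion never touches the call stack. A minimal sketch of that idea, which is a best guess at the concept rather than a reconstruction of the file.)

def trampoline(fn, *args):
    # Keep calling whatever gets returned until it is no longer callable.
    result = fn(*args)
    while callable(result):
        result = result()
    return result

def fib(n, a=0, b=1):
    # Return a thunk for the next step instead of recursing directly.
    if n == 0:
        return a
    return lambda: fib(n - 1, b, a + b)

print(trampoline(fib, 100000))  # no RecursionError, despite the depth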
list_parser: explores trees and the like, utilizing nested lists, no doubt.
vector: there is no reason to believe there is any overlap between my conception of a vector and the mathematical conception of a vector.
flea_jump: as I recall (ever so vaguely), this was for Project Euler #213. Sadly, I do not recall this being a working solution.
hash_work: more throwaway code.
bisect_heapq_test: total throwaway code.
bitwise: shift operators (and the like) are things that I seldom use. I am not a Byte Level Guy.
I did two versions of The Zebra Puzzle. But as I don't know how much of the solution I cut and pasted (it may have only been the original problem set), I will be omitting that bit of code from this (or any) posting. So, see? Not everything is getting posted.
conway_game_of_life: with GIF output.
eight_queens: solves the eight queens chessboard problem.
fold_closure_test: implements a Haskell style fold in Python... or so I am willing to assume rather than digging into the code.
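(A fold in Python usually boils down to functools.reduce. A quick sketch of the correspondence, not the contents of the file.)

from functools import reduce

def foldr(f, z, xs):
    # Right fold: foldr(f, z, [a, b, c]) == f(a, f(b, f(c, z)))
    return reduce(lambda acc, x: f(x, acc), reversed(xs), z)

print(foldr(lambda x, acc: [x] + acc, [], [1, 2, 3]))  # [1, 2, 3]
print(reduce(lambda acc, x: acc + x, [1, 2, 3], 0))    # left fold: 6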
fizz_buzz: it is sort of amazing how long it took me to have an organic use case for the mod operator. Though, not knowing what it was or how to use it might have had something to do with that.
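(The canonical version, for anyone who has managed to avoid it so far; not necessarily what the file holds.)

for n in range(1, 16):
    if n % 15 == 0:      # divisible by both 3 and 5
        print('FizzBuzz')
    elif n % 3 == 0:
        print('Fizz')
    elif n % 5 == 0:
        print('Buzz')
    else:
        print(n)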
change_maker: how many quarters, dimes, etc.
loan_calculator: payments and that sort of thing.
Scratch Pads
I believe all of the following were done in support of my Weekly Code Snippets Project. After all, one has to make sure any posted code works.
scratch_class: explores Class Structures.
scratch_closure: having come to some sort of understanding of both closures and @ function decorators, I can assure you that I would never willingly use either... unless they came prepackaged by another. It's just not how I think.
scratch_dict: totally worthless code.
scratch_format: not using any of this very often (text formatting, in this case), the information fades.
scratch_function: for the most part, I avoid lambdas, as well.
scratch_sys: notes are not as helpful if one was being sarcastic when they made them.
scratch_turtle: turtle is not working code, it is learning code.
scratch_turtle_two: some of us are slow learners. Also, I'm pretty amazed that I bothered to post this.
Finally, I should note somewhere, in a typical piece of code, the first two lines of the heading come from the IDE. And I add the third, just to make sure. So, it's more that I am too lazy to erase the second line than that I feel the need to list my name twice. Still, I think every piece of code I write starts like this:
Created on May 24, 2016
@author: Brett Paufler
Copyright Brett Paufler
website_ripper: was designed to download and extract information from a very specific website. I do not know if the website exists, anymore. But if you can determine which one of the millions of websites out there this was designed to work with (especially after all the obfuscation I've done), kudos to you. Sadly, prizes will not be awarded. But I will be impressed.
image_resize: which it does. These days, I'd use IrfanView for something like this.
MutaGenetics Football
I've probably sunk more time into this project than any other. Follow the link to see what this is all about... or accept the shorthand version. It's a Football Play Maker, which utilizes Genetic Programming Techniques.
football_logo.png: as it sounds.
football_v1: way too long at over a thousand lines of code.
football_v1_analysis: looks to be a Data Viewer for the above.
The Final Version of MutaGenetic Football is as follows. It was very much a work in progress when I put it on ice.
football_tournament: runs multiple scrimmages, picking the best ones in an iterative fashion. This is the entry point.
football_scrimmage: being essentially a contest... or if you like, a football play.
football_field: creates the football field.
football_roster: manages the team, each player having multiple (second-by-second) preprogrammed moves. It is these moves which are altered genetically.
football_matrix: a numpy based attempt to track everything. My notes say I didn't get very far. I've tried to work logic through numpy arrays a few times. But it's never really worked out for me.
football_workspace: a place to work out small sections of code, because I am the type of guy who forgets which comes first (x or y) in mod(x, y). Almost every project or sub-folder in my IDE has a workspace.py file in it. Most only include a single line of code (as I tend to delete as I go in the workspace file), so there's not much point in posting them.
receipts: a middle step in a long and brittle tool-chain. Using a SnapScan (or whatever those are called, not that it matters, as I no longer have access to one), I scanned my receipts, outputting a pdf. NuancePDF (or whatever, being the name of a proprietary software suite, for which I no longer hold a license) would convert the .pdf to a .doc file, which I would subsequently use as input for this program, which converts all those .doc files to .txt files, extracts date, store, and amount spent, and finally outputs a .csv containing the same. The program was about 80% accurate. Now (instead of using the just described tool-chain, as I no longer have access to a scanner or any Nuance software), I just take photographs of my receipts and enter the data by hand. And truthfully, as that's been getting a bit tiresome, I'm starting to use more and more cash.
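(The text-scraping step amounted to regex work along these lines. A sketch from memory with made-up patterns, not the 80%-accurate original.)

import re

def parse_receipt(text):
    # Pull the first date-looking token and the dollar figures from a receipt.
    date = re.search(r'\d{1,2}/\d{1,2}/\d{2,4}', text)
    amounts = re.findall(r'\$?(\d+\.\d{2})', text)
    return (date.group(0) if date else '',
            max(map(float, amounts)) if amounts else 0.0)  # biggest figure ~ the total

print(parse_receipt('SOME STORE 05/24/2016 subtotal $11.50 TOTAL $12.46'))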
doc_to_txt_and_html: using MS Word, converts a .doc file to .txt and .html. I no longer have MS Word installed, so this does nothing for me.
renamer: a throwaway script that I really should have thrown away the moment it had served its purpose. Once again (as in, please see the comment for the next piece of code, as well), I am trusting one of the new programs I've recently downloaded will take its place. Or if not, perhaps I will be forced to put together a better suite. Either way, I have too much junk code laying around. I didn't even know I had this.
flatten_directory: takes a directory tree and outputs all files into a single directory with all files renamed dir-sub-sub-file.ext. I haven't used this in a long time. Hopefully, one of the renamers I've downloaded recently will work in its stead. And if not, I will be refactoring this as a Command Line Utility, before long.
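(os.walk does most of the work; a fresh sketch of the renaming scheme, not the file itself.)

import os
import shutil

def flatten_directory(src, dst):
    # Copy every file under src into dst, renaming each one from its
    # relative path: 'dir/sub/sub/file.ext' becomes 'dir-sub-sub-file.ext'.
    os.makedirs(dst, exist_ok=True)
    for root, _, files in os.walk(src):
        for name in files:
            rel = os.path.relpath(os.path.join(root, name), src)
            shutil.copy2(os.path.join(root, name),
                         os.path.join(dst, rel.replace(os.sep, '-')))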
checksum: provides various hashes (checksums) for the input file. I'd simply use a one-liner at the Command Line, at this point.
batch_7z: runs multiple instances of 7zip. I stopped using this, as it saved me no time. The most interesting aspect of the code was the Command Line Switch Cheat Sheet. But to save myself any copyright headaches, I cut that out. Happily, the same information can be found in the 7-zip.html help-file packaged with 7zip.
KIVY
KIVY is a framework (perhaps a gaming framework) similar to TK... or at least, that is how I remember it. I did a little work exploring its capabilities (obviously, not much); and then, moved on.
kivy_notes: made a year after the fact.
kivy_button_work: creates a 10x10 grid with simple effects.
kivy_shape_shifter: calling it a drawing program would be a bit of a stretch. But let's be honest. At some point, getting anything to the screen is a bit of alright.
kivy_wall_bouncer: the name says it all. And if not, the introductory comments seemed pretty clear to me.
hogtest: HOG stands for Histogram of Oriented Gradients, which is an image analysis term that I do not understand, but which I hacked to make pretty (or not so pretty) pictures. Ironically, this little nugget was the sole entry in a graveyard folder in my IDE. I've tried to organize my Dead Code (and working code) numerous times before. Nonetheless, I have high hopes this will be the solution that works, because as I post these (truly remarkable) code snippets, I delete them from my main Code Repository. This truly is a garbage can... which is not to say I'll never want to cut and paste anything from it. Still, that's way more words than this script deserves. So, I shall stop, here.
Computer Scan
I had made two separate Website Scanners, which are good enough that I still use them. So, I thought I'd try to understand my computer's directory tree a bit better and note any changes made to it. Um, I abandoned the project.
comp_catalog: creates a full computer directory listing.
comp_common: a total fail at keeping things DRY.
comp_compare: compares (or begins work on comparing) two different directory listings.
comp_report: abandoned. I think I may have mentioned this.
comp_tree: also, abandoned. Or so the notes say.
Lottery Analysis
Once again, somewhere on this site is a Lottery Analysis, explaining why someone would continually bet on a losing proposition and why a State Lottery (the California Lottery in this particular case) might be a better bet (this being a statement of personal opinion) than any other legal option. I originally included numbers in the names of these scripts (lotto_01..., lotto_05..., and so on), so I'll just post them in that order, redacting the numbers as I shorten their names.
lotto_class: includes the base classes: lotteryGame and prizeLevel.
lotto_download_html: retrieves URL's for all games and html pages for draw games.
lotto_download_scratchers: using the results of the previous, retrieves scratcher pages. Note: none of this is expected to work anymore, based on address changes in the CA Lottery Site.
lotto_pickle: scrubs the downloaded html for data and pickles the result.
lotto_benchmarks: creates comparable data for casino games.
lotto_graph_longodds: graph and analysis, as per all of the following unless otherwise noted.
lotto_graph_gambler_ruin
lotto_chart_chance
lotto_chart_rpd
lotto_graph_powerball
lotto_graphs_rpd_chance
lotto_rpd_game_profiler: attempts to prioritize games based upon my personal criteria, given that betting on the lottery is a losing proposition no matter how one plays.
lotto_analyse_game_1109: a scratch pad for CA Game #1109, which looks to be a $1 Scratcher.
Ideally (and in retrospect), the individual files should have been more focused. On the other hand, I find individual (and multiple) files handy when working with graphs.
Dead Code may not be the best name for a project started during the Corona Virus.
TK Test Kitchen
I believe all of this was previously posted in a TK Test Kitchen web-page somewhere in Brett Code. Between the lot, there is an implementation of Proximity (a Mine Sweeper Clone), Risk It (a game inspired by Risk, but not nearly as complicated), and Tic Tac Toe (which I am going to assume needs no explanation).
tk_window_expansion: bare minimum to resize a TK Window.
proximity: a Mine Sweeper Clone.
tic_tac_toe: yes, it is.
risk_it: a human playable version of the game.
risk_it_logic: a Tournament Harness for playing multiple games automatically.
risk_it_graphs: Graphical Analysis used for write-up.
Chess
The next three are all part of the same short-lived project... short-lived from a programming standpoint, anyhow. The project itself stuck around for a little over a year.
chess_pieces_utf: creates the following text file (as in, it's the next link below), as a preliminary for more complicated output, which I never got to.
chess_pieces_utf_output: UTF chessboard symbols for use in other applications, say a Python Program that creates a chessboard image, which, as I've said, I never implemented.
chess_pgn_first_move: extracts the first move from a standard .pgn file (not a .png, but a .pgn, a Chess Game Data Structure), along with some other meta-data.
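(A rough sketch of the sort of parse involved. It ignores comments and variations, which a proper .pgn reader would not, and it is not the original code.)

import re

def first_move(pgn_path):
    # Grab a couple of header tags plus the token right after '1.'.
    text = open(pgn_path, encoding='utf-8').read()
    headers = dict(re.findall(r'\[(\w+) "([^"]*)"\]', text))
    moves = re.sub(r'\[[^\]]*\]', '', text)   # strip the header block
    match = re.search(r'1\.\s*(\S+)', moves)
    return headers.get('White'), headers.get('Black'), match.group(1) if match else None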
Website Pages
This marks the end (or beginning, if one is reading from the top down) of the website_pages module in my IDE. In the future, rather than storing any code there, I would intend to post it to this page, instead.
Many of these were page generators. And I have another folder filled with those for use in converting email and text based notes to web-pages. At some point, I should refactor that into a more modular system. But that day is not today.
css_notes: another dead file. This was a resource of interesting css configurations. But over the years, I've found it easier to simply look under the hood at my old web-pages, if I wish to duplicate an effect.
figcaptions: given a bunch of repetitive work, I will try to automate it. Here, we have the creation of figcaptions for images from a text file. Even so, there are easier ways to do this... manually, for one. Having completely forgotten about it, I never use this script. And even now that I've been reminded of it, I am far from tempted.
page_sleep_studies: after tracking my sleep for a month or so, I used this script to make a few graphs. There's nothing like collecting data to highlight how tentative all data must be.
page_constitution: a one-shot used to create the graphs and such on my Constitutional Analysis page. It takes as input a text file (formatted just so, I am sure), which I shall not bother to post.
page_test_pattern: creates Perlin Noise adjacent images... or so, I will claim. I'd use a G'MIC one-liner these days.
{Actually, all of the preceding is in error. Without running the code, I believe this makes Modern Art using lined circles, so an optical illusion type effect, which has nothing to do with Perlin Noise.}
weekly_code_snippets: a worksheet for my Weekly Code Snippets page. Amusingly, that particular project lasted for at least a year... I would guess. But this code repository includes but a single week. And, yes. I do hate throwing out words. Hence, the entire reason for this Dead Code page.
page_light_genetic: the worksheet (to ensure I was posting good code) for another page for yet another Lightning Talk never given, this one about Genetic Programming.
page_light_functional: I thought I might give more Lightning Talks, so I made a few more pages. This is the test code for the page for a talk I never did give on Functional Programming. I would now describe Functional Programming as a desire to write code in such a way that the rules for Abstract Algebra hold true. It's a nice starting point. But not an end all.
page_light_img_crack: the practice code used to develop a Lightning Talk about image cracking. The relevant page is on my site somewhere.
page_image_to_table: the given image ("work.jpg" in this case) is recoded as a html table and outputted as a fully formatted web-page.
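(The technique, sketched with PIL and a sampling step so the table stays a sane size. A fresh sketch, not the script itself.)

from PIL import Image

def image_to_table(path, step=10):
    # One colored <td> per sampled pixel; the browser does the rest.
    img = Image.open(path).convert('RGB')
    rows = []
    for y in range(0, img.height, step):
        cells = []
        for x in range(0, img.width, step):
            r, g, b = img.getpixel((x, y))
            cells.append('<td style="background:#{:02x}{:02x}{:02x}"></td>'.format(r, g, b))
        rows.append('<tr>{}</tr>'.format(''.join(cells)))
    return '<table cellspacing="0">{}</table>'.format(''.join(rows))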
page_colorwheel: a one-shot program, which creates a html page with numerous color samples... so, highly repetitive work, here. All the programs which start page_ are one-shot html page creation scripts.
base: utilizing jinja2 as a template system, this creates a html web-page from the passed values. My templates are not so complicated and jinja2 is not so helpful that I can ever see using that utility again. This script is imported by the next two programs.
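(The jinja2 pattern in miniature. The real base.py presumably carried a full page skeleton rather than this stub.)

from jinja2 import Template

page = Template(
    '<html><head><title>{{ title }}</title></head>'
    '<body>{{ body }}</body></html>')

html = page.render(title='Dead Code', body='<p>Hello.</p>')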
doc_to_html: eliminates the crap encoding from the default html produced by MS Word. The highlight (down at the code level) is a working for loop.
symbol_table: creates a HTML Table full of the first 10,000 Unicode Characters. Or in other words, it's a simple for loop outputted as text. Ah, to be young again, when for loops were new and I was happy to automate absolutely anything.
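(Which is to say, something on the order of the following. A guess at the shape of it, not the file itself.)

# One row per code point: the number, the hex form, and the character itself.
rows = ['<tr><td>{0}</td><td>U+{0:04X}</td><td>&#{0};</td></tr>'.format(n)
        for n in range(32, 10000)]
with open('symbol_table.html', 'w', encoding='utf-8') as f:
    f.write('<table>{}</table>'.format(''.join(rows)))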
archive_article_html: creates a base html file, which links to previously created html files in the same directory. Hindsight is 20/20. But I would have been better served by having a dedicated html page creator from the get go and referencing ALL page creation needs to the same template. Um, that's not clear. So, let us say, the rudimentary basis of a html page is rewritten countless times in my code base... and is maybe something I should do again, one final time.
archive_article: a one-shot, written so long ago, I cannot remember what it does. By the look of it, it literally does nothing but complete a single find/replace task. Oh, well. Time to kill. I used it for the Archive Article section of my website. A few more like this and there will be no wonder why my code repository got overrun.
poetry_in_motion: converts .txt files to .html, being another hyper-specialized mass-conversion tool for the Poetry In Motion sequence.
twelfth_century: converts .doc files to .html, a one-shot used for posting The Twelfth Century Series.