
reorganize "files" (binaries) folder and add to repo #51

Open
IvanExpert opened this issue Dec 20, 2015 · 21 comments

@IvanExpert
Contributor

Ok, need your help here, read the whole thing:

I'm in the process of reorganizing the A2SERVER files folder (which is mostly binaries) to have more consistent names, e.g. "foo-rpi.tgz" for Raspberry Pi, "foo-debian_x86.tgz" for Debian x86, "foo-debian_x64.tgz" for Debian x64, and "foo-debian7_x86.tgz" for something specific to Debian 7 (e.g. due to some dependency on an older package not supplied in Debian 8).

Since these binaries are as important a part of the install process as the scripts, I want to add them to the repo, excluding the enormous OS distribution files, which will get a separate folder (e.g. "rasppleii/dist"). And then I want these binaries to be supported by the A2SERVER_SCRIPT_URL export. So far, so good, I can do all that.

Here's a problem I don't know quite how to resolve: several files, e.g. unar-rpi.tgz, nulib2-rpi.tgz, and raspbian-update.txt, are shared by A2SERVER and A2CLOUD. I don't want to end up with redundant and potentially inconsistent versions in the A2SERVER and A2CLOUD hierarchies. In the bad old pre-GitHub days, I logged into the web host's shell and simply made symlinks to a single file as a way to avoid this.

However, that isn't gonna fly now. My inclination is to put common files in rasppleii/common/files and then change any links in the scripts to use that. I've already started to do this. However, this poses two problems:

  • As the A2SERVER_SCRIPT_URL shell variable is currently implemented, it would require ugly reach-backwards URLs like
wget -qO /tmp/somefile.tgz ${A2SERVER_SCRIPT_URL}../rasppleii/common/files/somefile.tgz
  • A developer can't clone just the rasppleii/a2server project -- they also have to get rasppleii/rasppleii

The first would seem to be addressable by a more generalized A2SERVER_SCRIPT_URL, e.g. RASPPLEII_SCRIPT_URL, and then the A2SERVER URLs would all need to specify "a2server/", e.g. ${RASPPLEII_SCRIPT_URL}a2server/etc/etc.txt
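
For illustration, a minimal sketch of how that could look (the variable name, default URL, and paths below are placeholders, not anything that exists yet):

    # Sketch only: one base URL variable for the whole Raspple II tree.
    # The default URL and paths are illustrative placeholders.
    RASPPLEII_SCRIPT_URL="${RASPPLEII_SCRIPT_URL:-http://example.com/rasppleii/}"

    # An A2SERVER-specific file:
    wget -qO /tmp/somescript.txt "${RASPPLEII_SCRIPT_URL}a2server/scripts/somescript.txt"

    # A shared file, with no "../" reach-backwards needed:
    wget -qO /tmp/somefile.tgz "${RASPPLEII_SCRIPT_URL}common/files/somefile.tgz"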

The second would seem to be addressable by indicating rasppleii/rasppleii as a dependency which can be requested during the git clone, though I don't know how to do that; or just clear instructions that the developer needs both.

Or, maybe there's another way to maintain consistent binaries in separate hierarchies in Git; if so, I don't know how to do that either.

TL;DR: What do you propose we do to make the 'files' subdirectory part of the repo but also keep things consistent for common files?

(Of course, this also confuses issues somewhat because Raspple II is being used generically as the parent of A2SERVER and A2CLOUD, even for non-Raspberry Pi setups, but that's, I think, just how things are gonna be, e.g. the name of this whole repo.)

@knghtbrd
Member

This is a complex problem. First, binary files don't work well in VCS in general, since each revision of the file must be stored and downloaded. Moreover, every time you clone a DVCS repository like the ones GitHub hosts, you download every single version of every single binary in version control. That means if you WERE trying to put the multi-gigabyte VM images in traditional git hosting, you could find yourself trying to download terabytes within a year of releases! There's a large-file extension for Git (Git LFS) for getting around this, but the most recommended thing to do is store your source code in VCS and compile the binaries with checkout hooks (including cross-compiling.)
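
For reference, a rough sketch of what tracking the tarballs with Git LFS would look like (the pattern and file name below are just examples, not a recommendation to go this route):

    # Sketch only: Git LFS stores pointers in the repo and keeps the actual
    # tarballs out of normal clone history.
    git lfs install
    git lfs track "*.tgz"
    git add .gitattributes
    git add files/netatalk-rpi.tgz        # example file name
    git commit -m "track binary tarballs with Git LFS"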

But that's still a big problem, and one of several inter-related problems. The first is a relatively small bite, but it lays some groundwork: the common scripts (and their dependencies). The scripts common to both A2CLOUD and A2SERVER appear to be simply raspbian-update and cppo. Let's deal with cppo first.

First, it doesn't make sense to have cppo by itself. There are other scripts that go along with it: acmd, dopo, mkpo, dos2pro, and shk2image. The dependencies for those tools seem to be AppleCommander and nulib2.

Is AppleCommander even being maintained anymore? It appears that it may not be, but that might just be because not much has been needed since Google Code shut down. We've discussed before that it's kind of weirdly split between Google Code and SourceForge.

If it's kinda orphaned upstream, we might just fold it along with the scripts into a new repository, which is one of the things you're suggesting. That removes some duplication from both A2SERVER and A2CLOUD.

The other shared script is raspbian-update which I have always had some mixed feelings about. I'd suggest leaving it alone for the time being and just accept that there are two copies for now. I'd rather there be only one, and I'd rather it not be part of either A2CLOUD or A2SERVER in the end. But I'm much better able to sell myself on the idea of removing it from both once a lot more stuff has been done toward doing things the Debian way.

Before I blather on about potential solutions to anything else, I'll stop here so that we can make sure we're in the same book, to say nothing of the same page. :) How's my thinking thus far?

@knghtbrd
Member

I'm building a full dependency list for A2SERVER and I'm a little confused by some of the logic used in picking packages for wheezy/jessie for netatalk:

        if [[ $isJessie ]]; then
            if [[ $isRpi ]]; then
                getOldPackage "http://mirrordirector.raspbian.org/raspbian/pool/main/d/db/libdb5.1_5.1.29-5_armhf.deb"
            elif [[ $isDebian_x86 ]]; then
                getOldPackage "http://ftp.debian.org/debian/pool/main/d/db/libdb5.1_5.1.29-5_i386.deb"
            fi
        elif [[ $(apt-cache search '^libdb4.8$') ]]; then
            sudo apt-get -y install libdb4.8
        elif [[ $(apt-cache search '^libdb5.1$') ]]; then
            sudo apt-get -y install libdb5.1
        else
            break
        fi

I get isJessie there—getting the packages from wheezy if they're not installed so that you can use the older netatalk 2.2.4 binaries without recompiling them. What's the libdb4.8 there for? And preferring it to libdb5.1?

@IvanExpert
Contributor Author

I get isJessie there—getting the packages from wheezy if they're not installed so that you can use the older netatalk 2.2.4 binaries without recompiling them.

That's actually not the reason; I already have Jessie Netatalk binaries for both arm and x86, compiled based on the newer packages, and this was my original approach. However, I instead opted to use the older packages because 2.2.4 is specifically qualified for use with them, and I was unsure whether libgcrypt20 and libdb5.3 are specifically supposed to be backwards compatible with older versions. With that said, I don't see any ill effects when compiled with the newer packages, and others have reported success (by default, it will fall back on source which has transitional packages available), so how I'm doing it for now could be revisited.
What's the libdb4.8 there for? And preferring it to libdb5.1?

This might no longer be valid. I think Netatalk 2.1.x, which is what I started with, may not have explicitly supported 5.1, so my bias was towards 4.8, which I'd seen proven to work and which was explicitly supported; and whatever version of Ubuntu I was using at the time (before I switched to Debian) may not have offered 5.1. However, Raspbian didn't offer 4.8, so I installed the later but less proven version when that was the only option. I tend to be very much of the "if it ain't broke, don't fix it" school, and also the "updates might or might not cause problems" school. However, there's probably no reason not to always install 5.1 at this point if it's available.

@IvanExpert
Contributor Author

I'll prioritize libdb5.1 over libdb4.8 for the next commit.

@knghtbrd
Member

Should I drop 4.8 then? I'll leave you to decide whether to use the
newer Jessie packages otherwise or not.

I've got a pending commit that collapses some apt-get -y install
lines for build-dependencies to one line and explicitly comments that
those are build dependencies. Makes a few lines a bit long, but will
help later on I think.

Joseph


@knghtbrd
Member

Likewise, drop libssl0.9.8?

@IvanExpert
Contributor Author

This is a complex problem. First, binary files don't work well in VCS in general

Duh, don't know why I didn't think about that. Of course not.

but the most recommended thing to do is store your source code in VCS and compile the binaries with checkout hooks (including cross-compiling.)

Eww. I'd rather have checkout hooks that just download the binaries from elsewhere. In nearly every case where there are binaries, the source code already exists somewhere, and I don't want to replicate it; plus, compiling on every checkout is slow and ugly. In some cases, the binaries are actually things like Apple II floppy disk images, and while theoretically those could be built at checkout with things like AppleCommander, yuck. Perhaps what I'll do is make a tarball of the binaries, host it on my web site, and have the developer README say where to grab and extract them for local or alternate-server installations with A2SERVER_SCRIPT_URL.
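
Roughly the sort of thing the README could describe (the URL and paths below are placeholders, just illustrating the idea):

    # Placeholder URL and paths -- illustrative only.
    wget -qO /tmp/a2server-binaries.tgz "http://example.com/a2server-binaries.tgz"
    mkdir -p ~/a2server-dev/files
    tar -xzf /tmp/a2server-binaries.tgz -C ~/a2server-dev/files
    # then point A2SERVER_SCRIPT_URL (or a local web server serving that
    # folder) at the development copy instead of the public site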

First, it doesn't make sense to have cppo by itself. There are other scripts that go along with it: acmd, dopo, mkpo, dos2pro, and shk2image. The dependencies for those tools seem to be AppleCommander and nulib2.

Au contraire. There's a good reason to have cppo by itself: a dependency for AppleCommander is Java, which is NOT something I want to require for an A2SERVER-only installation. So I'm just as happy leaving those non-cppo items in A2CLOUD only. (Though I suppose dopo could be included in A2SERVER, as it has no dependencies.)

The other shared script is raspbian-update which I have always had some mixed feelings about. I'd suggest leaving it alone for the time being and just accept that there are two copies for now.

I can live with that as a short-term solution for cppo and raspbian-update, though I dislike the idea of these scripts forking as a result of being in two different projects, which seems like it could easily happen. I'll try to take care to make sure that if I update one, I update the other, but that'll be a hard thing to rely on.

@IvanExpert
Contributor Author

Don't drop 4.8 entirely; someone might want to install on an older Linux that doesn't have 5.1. In my next push I'll have libdb5.1 and libssl1.0.0 only for runtime, and fall back on 4.8 for dev; and I'll roll a new netatalk binary built against 5.1 and 1.0.0.
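
Something along these lines, in the same style as the existing check (a sketch, not the actual commit):

    # Sketch: prefer the newer runtime libraries, fall back only if needed.
    if [[ $(apt-cache search '^libdb5.1$') ]]; then
        sudo apt-get -y install libdb5.1
    elif [[ $(apt-cache search '^libdb4.8$') ]]; then
        sudo apt-get -y install libdb4.8
    fi
    if [[ $(apt-cache search '^libssl1.0.0$') ]]; then
        sudo apt-get -y install libssl1.0.0
    elif [[ $(apt-cache search '^libssl0.9.8$') ]]; then
        sudo apt-get -y install libssl0.9.8
    fi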

@IvanExpert
Contributor Author

I've got a pending commit that collapses some apt-get -y install
lines for build-dependencies to one line and explicitly comments that
those are build dependencies. Makes a few lines a bit long, but will
help later on I think.

I'm hoping I'm gonna be done with my end of things soon; I think my preference would be for you to hold off on pushing your refactoring until I get 1.3.0 out the door, after which you can rework everything as you see fit.

@knghtbrd
Member

No serious refactoring, just adding comments and collapsing two adjacent lines into one, like so:

diff --git a/scripts/a2server-2-tools.txt b/scripts/a2server-2-tools.txt
index b7142f8..e5f395d 100755
--- a/scripts/a2server-2-tools.txt
+++ b/scripts/a2server-2-tools.txt
@@ -40,9 +40,10 @@ if ! command -v nulib2 > /dev/null; then
             touch /tmp/a2server-packageReposUpdated
         fi

-        sudo apt-get -y install build-essential
-        sudo apt-get -y install zlib1g-dev
+        # Dependencies: build-dep for nulib
+        sudo apt-get -y install build-essential zlib1g-dev
         sudo apt-get -y clean
+
         cd /tmp
         rm -rf /tmp/nulib &> /dev/null
         mkdir /tmp/nulib
@@ -78,6 +79,7 @@ if ! command -v unar > /dev/null; then
         touch /tmp/a2server-packageReposUpdated
     fi

+    # jessie and later: Just use the unar package
     if [[ $isJessie ]]; then
         sudo apt-get install -y unar
         sudo apt-get clean
@@ -85,8 +87,11 @@ if ! command -v unar > /dev/null; then

     if ! command -v unar > /dev/null; then
         if [[ $isRpi || $isDebian ]]; then
+
+            # Dependencies: for unar
             sudo apt-get -y install libgnustep-base1.22
             sudo apt-get clean
+
             if [[ $isRpi ]]; then
                 wget -qO- "http://appleii.ivanx.com/a2server/files/unar-rpi_wheezy.tgz" | sudo tar Pzx
             elif [[ $isDebian ]]; then
@@ -96,10 +101,11 @@ if ! command -v unar > /dev/null; then

         # If all else fails, compile from source.
         if ! command -v unar >/dev/null; then
-            sudo apt-get -y install build-essential
-            sudo apt-get -y install libgnustep-base-dev libz-dev libbz2-dev
-            sudo apt-get -y install libssl-dev libicu-dev unzip
+
+            # Dependencies: build-deps for unar
+            sudo apt-get -y install build-essential libgnustep-base-dev libz-dev libbz2-dev libssl-dev libicu-dev unzip
             sudo apt-get clean
+
             rm -rf /tmp/unar &> /dev/null
             mkdir /tmp/unar
             cd /tmp/unar
@@ -121,7 +127,9 @@ fi

 if ! command -v unzip >/dev/null; then
     echo "A2SERVER: Installing unzip..."
-sudo apt-get -y install unzip
+
+    # Dependencies: unzip
+    sudo apt-get -y install unzip
 else
     echo "A2SERVER: unzip has already been installed."
 fi

The idea I had was to do that kind of thing for apt-get lines, then go back and do likewise for wget lines so it's very easy to search for the next dependency. It doesn't have to be done now though if you're in the midst of preparing a commit.

Actually in that file, you can see that unzip is conditionally installed to satisfy a build-dep, then installed unconditionally so it's available right afterward. I did plan, after this commit, to move the unzip install above the one for unar and remove unzip from the extra build-deps for compiling unar from source, since it's already a dependency of A2SERVER as a whole. Moving a block up about four lines is not much of a refactor, but I don't want to do anything significant to it until after 1.3.0 is ready.
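
Roughly what that move would look like (just a sketch, not the pending commit):

    # Sketch: install unzip before the unar section so the unar build-deps
    # line no longer needs to include it.
    if ! command -v unzip >/dev/null; then
        echo "A2SERVER: Installing unzip..."
        sudo apt-get -y install unzip
    else
        echo "A2SERVER: unzip has already been installed."
    fi

    # ...unar section follows; its build-dep line then drops unzip:
    # sudo apt-get -y install build-essential libgnustep-base-dev libz-dev libbz2-dev libssl-dev libicu-dev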

@IvanExpert
Contributor Author

Ok, that all seems pretty reasonable. I expect 1.3.0 to be ready shortly.

@knghtbrd
Member

Referring to my thoughts about collapsing the apt-get lines a little, or to a repo for acmd and related scripts and all?

I'll comment about the next bit I've got in mind if that all seems good.

Joseph


@IvanExpert
Contributor Author

Referring to collapsing apt-get lines, and specifically not referring to a repo for acmd and related scripts, other than cppo (and maybe dopo). The scripts that depend on AppleCommander belong in A2CLOUD, because if they require AppleCommander, then they require Java, and A2SERVER should not require Java. (A2CLOUD by definition requires Java because the ADTPro server is in Java.)

@knghtbrd
Member

Saw your other replies later (iOS Mail…). I'll stick to the trivial
stuff for now and not move any files for the time being so as not to
slow down 1.3.0 development.

Joseph


@knghtbrd
Member

Just pushed that; I had a minor merge conflict where you added an apt-get update where I added a comment. The correct solution was obviously to have both.

@IvanExpert
Contributor Author

Ok so there are a few issues being bandied about here, let's wrap up:

  • The collapsed and commented apt-get statements have been merged in with my stuff and are in 1.2.8.
  • You're gonna mostly lay off the main scripts until 1.3.0 (which I now want to call 1.5.0) is done.
  • We're tabling the issue of shared scripts for now, and living with duplicated cppo and raspbian-update in A2SERVER and A2CLOUD; we will revisit later. I maintain that I don't think the other utilities belong in A2SERVER due to their Java dependency; I'm all for a rewrite of AppleCommander in Python, but until then, I see no reason for them to have their own repo independently of A2SERVER, since Java is such a pig and you wouldn't want to install it on your system just for these tools (IMO), which have no formal utility in A2SERVER.
  • I haven't addressed the issue of shared resources, but I've addressed the issue of where to put binaries by creating the A2SERVER_BINARY_URL and A2SERVER_NO_EXTERNAL environment variables so things aren't pinned to ivanx.com. A developer can download binaries in a tarball and add them to his or her A2SERVER development folder for local installation, eliminating the need to store them in SCM. This is further documented in the 1.2.8 release notes (a usage sketch follows at the end of this comment).
  • The binaries folder has been reorganized, as detailed in the 1.2.8 release notes.
  • libdb5.1 and libssl1.0.0 are now always preferred, but the scripts will fall back on the older versions if those aren't available (e.g. on Debian 6 or an older Ubuntu, though who knows whether other things break on those).

I'll leave it to you to decide whether this issue should remain open, whether new issues should be spawned out of the above, or what.
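
Usage is along these lines (the values below are placeholders; the 1.2.8 release notes have the actual details):

    # Placeholder values; see the 1.2.8 release notes for specifics.
    export A2SERVER_BINARY_URL="http://example.com/a2server/files/"
    export A2SERVER_NO_EXTERNAL=1
    # ...then run the A2SERVER install as usual.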

@knghtbrd
Member

On Wed, Dec 23, 2015 at 09:22:41PM -0800, IvanExpert wrote:

Ok so there are a few issues being bandied about here, let's wrap up:

  • The collapsed and commented apt-get statements have been merged in with my stuff and are in 1.2.8.

Cool, that'll help for future fiddling with them.

  • You're gonna mostly lay off the main scripts until 1.3.0 (which I now want to call 1.5.0) is done.

Correct. I know you have a specific roadmap in mind here and I know
you're still getting used to the tools. We need at least one more
release of A2SERVER before I go into major renovations mode and
there's other stuff to be done. If you could use a hand with
something in the meantime, let me know and I can step in.

  • We're tabling the issue of shared scripts for now, and living with duplicated cppo and raspbian-update in A2SERVER and A2CLOUD; we will revisit later. I maintain that I don't think the other utilities belong in A2SERVER due to their Java dependency; I'm all for a rewrite of AppleCommander in Python, but until then, I see no reason for them to have their own repo independently of A2SERVER, since Java is such a pig and you wouldn't want to install it on your system just for these tools (IMO), which have no formal utility in A2SERVER.

An AppleCommander rewrite in Python is a pretty major undertaking
anyway. We might want to revisit things before it is begun. I don't
mind taking on major projects if there's a clear roadmap that can
break the pieces into bite-sized chunks, but we haven't got that for
AppleCommander yet. I wouldn't mind moving cppo and dopo and
anything else that doesn't depend on Java into their own common repo
that would eventually become the home of an ApplePyCommander in the
future, but again I'd like a roadmap first.


That's a common thing with me—take the website repo I've started
working on: Multiple major objectives, particularly for a guy whose
strong suit is NOT web development! But I have a plan for most of
the pieces including those I won't need for quite some time. I
haven't put any of my kramdown templates in DVCS yet because I am
reading over how Jekyll works. The templates don't and won't use
Jekyll, but when I upload them I want them to follow Jekyll's
structure and whatnot so that when it's time to look at using it, the
groundwork is laid.

Of course I haven't actually committed it that way yet because I'm
still determining if Jekyll is the right tool for the job. We're not
really doing a blog here, after all. Eh, there could be a blog I
suppose, but we really need something else and Jekyll appears to be
aimed at blogging. It does more I know, and it doesn't have all the
security crap WordPress and whatnot do, but still…

  • I haven't addressed the issue of shared resources, but I've addressed the issue of where to put binaries by creating the A2SERVER_BINARY_URL and A2SERVER_NO_EXTERNAL environment variables so things aren't pinned to ivanx.com. A developer can download binaries in a tarball and add them to his or her A2SERVER development folder for local installation, eliminating the need to store them in SCM. This is further documented in the 1.2.8 release notes.

Sounds good.

  • libdb5.1 and libssl1.0.0 are now always preferred, but the scripts will fall back on the older versions if those aren't available (e.g. on Debian 6 or an older Ubuntu, though who knows whether other things break on those).

Do we actually still support Debian 6? (I removed Ubuntu support
some time ago following your suggestion that it either needed work or
removal.)

Issue #2 regarding Debian 6 is that on Raspbian, it's armel, which
means our binaries won't work. It's also no longer receiving any
kind of security updates, and there have been MAJOR nasty security
flaws in very core libraries this year, after 8.0 went stable, which
means that things like Heartbleed are likely unfixed. Eep!

Ultimately after 1.5.0, my plan is to package the scripts for Debian.
This means the package will depend on the versions of things we tell
it to depend on, which should be whatever's in stable for a stable
release.

The bonus for doing it that way is that if the scripts get to the
point that they don't need to apt-get anything (which is going to
require us packaging our own netatalk or fixing Debian's for 9.0 or
jessie-backports), A2SERVER will not depend on Debian significantly
at all anymore. That is, assuming that you duplicate the dependency
stuff we do for the Debian package. The source package oughtta build
on Ubuntu as-is or with only the slightest alterations. And anything
else under the Linux sun should work if you've got the right stuff in
your kernel.
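
If it helps picture it, a build on stable would be roughly the usual
Debian routine once there's a debian/ directory in place (the package
name below is an assumption, not something that exists yet):

    # Sketch only: standard Debian source-package build, nothing
    # A2SERVER-specific yet.
    sudo apt-get -y install build-essential devscripts debhelper
    cd a2server
    dpkg-buildpackage -us -uc
    sudo dpkg -i ../a2server_*.deb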

I don't intend to take the time to make any of that work myself, but
I'm happy to invite anyone who wants it to work on another
distribution to bring their favorite distro to the party. RH and
Arch package scripts are the obvious choices.

I'll leave it to you to decide whether this issue should remain open, whether new issues should be spawned out of the above, or what.

I'll have a look tomorrow, it's late for me just now given kids plus
Christmas morning in mere hours. ;)

Joseph

@IvanExpert
Contributor Author

IvanExpert commented Dec 24, 2015 via email

@knghtbrd
Member

That's where you started working on this stuff, wasn't it?

Joseph

On Thu, Dec 24, 2015 at 01:52:25AM -0800, IvanExpert wrote:

No reason to still support Debian 6; I'll just drop support for those older packages in the next commit. Maybe I'm just sentimentally attached to them :P



@IvanExpert
Contributor Author

Around 2010, started with Ubuntu Server, yep


@knghtbrd
Member

Status update: part of this has been resolved with the binary URLs code, but the rest still needs to be revisited. We still need to work out the duplication and how things are stored, and we'll probably need to address this whole issue twice more before we can call it "done". This week should see things moving from ivanx.com to a flat directory at one of my domains, along with infrastructure on ivanx.com to avoid breaking things for existing users while still allowing me to update these files as needed.

I'll be working out what needs to be generated, what needs to be packaged, and what needs to be downloaded from there. That's a longer, multi-step process.
