Ticket #991 (closed task: fixed)

Opened 3 years ago

Last modified 12 months ago

Test Plumi Installer / Buildout on Latest Stable Debian Release - fix any errors so it will install and run

Reported by: anna Owned by:
Priority: blocker Milestone: 4.5.2
Component: Architecture Severity: New Ticket
Keywords: Cc:
Who will test this: And

Description (last modified by anna) (diff)

the installer is broken for latest debian/ubuntu systems due to python packaging updates (with setuptools etc).

from the email list:

I thought you'd be interested to know that the main howto for installing Plumi, given here, is now non-functional. The root cause is that the buildout relies on downloading files from a location which no longer seems to host the correct files.
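When a buildout's download location disappears, one common stopgap is to point the buildout at an index (or extra download locations) that still hosts the needed files. This is a generic sketch only — the URLs below are illustrative placeholders, not the host referred to in this ticket:

```ini
[buildout]
# Illustrative only: override the package index the buildout uses.
index = https://pypi.org/simple
# find-links can list extra locations for specific tarballs, e.g.:
# find-links = https://example.org/downloads
```

Whether this helps depends on whether the missing files are ordinary eggs (which an index override can supply) or recipe-specific downloads pinned by URL, which need their URLs updated in the buildout itself.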


Change History

comment:1 Changed 3 years ago by anna

  • Description modified (diff)
  • Summary changed from installer is broken for latest debian/ubuntu systems due to python packaging updates to Test Plumi Installer / Buildout on Latest Stable Debian Release - fix any errors so it will install and run

Debian 8 (Jessie) was released on April 25th, 2015. Unless a newer stable version has emerged, please test on that.

Sam did this today - "buildout got fixed just before lunch .. seems there is a latest version of varnish that get installed if you update .. I pinned it back to the previous version in the buildout and all went ok"
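The varnish pin Sam describes would look roughly like this in the buildout's [versions] section — a sketch only; the exact version number and part layout in the Plumi buildout may differ:

```ini
[versions]
# Pin varnish back to the previous known-good release so buildout does
# not pick up the newer, broken version. "x.y.z" is a placeholder for
# the version Sam actually pinned.
varnish = x.y.z
```

If the buildout builds varnish from source via a recipe part rather than as an egg, the pin would instead be a url/version option on that part.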

comment:2 Changed 2 years ago by anna

This is pretty much ready to go.

comment:3 Changed 2 years ago by anna

Actually this is not quite ready to go - this ticket is awaiting documentation of where it is at.

comment:4 Changed 2 years ago by sam

Progress to date:

Installation of Plumi is almost intractable due to issues with old dependencies. Here are my notes so far:

Using these instructions as a guide: and hence also:

Modify the instructions with the following:

  • also add install of libssl-dev to system preparation
  • make sure site.cfg is edited to suit your local preferences
  • I needed to run bootstrap && buildout twice
  • don't run bootstrap in ffmpeg sub-directory: just do the buildout later (see below)
  • pre-install libjpeg62-turbo-dev instead of libjpeg62-dev
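Pulling the package changes above together, the adjusted system-preparation step would look something like this — a sketch of the notes above (Debian 8 package names, run as root), not a verified command line, and it does not repeat the other build dependencies from the original howto:

```shell
# Add libssl-dev, and use libjpeg62-turbo-dev instead of libjpeg62-dev
# (Jessie ships the turbo variant of the JPEG library).
apt-get install libssl-dev libjpeg62-turbo-dev
```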

... UPDATE: due to changed dependencies, the bootstrap script in this branch now fails - as per #993, this is the solution: "The key seems to be installing the 'distribute' package version 0.6.49 manually as per"

Download the source tarball, uncompress it, then run the install command:
$ curl -O
$ tar -xzvf distribute-0.6.49.tar.gz
$ cd distribute-0.6.49
$ python setup.py install
  • then (as usual) run ../python/bin/python ./bootstrap.py (using the original Plumi Python), and then ./bin/buildout -Nv


  • run the buildout for ffmpeg:
$ cd ffmpeg/
$ buildout -v
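Put together, the bootstrap/buildout steps above look roughly like this — a sketch, where bootstrap.py is the standard zc.buildout bootstrap script and the paths assume the layout used in these notes:

```shell
# Main Plumi buildout: bootstrap with the buildout-built Python, then
# run the buildout (may need repeating, per the note above about
# running bootstrap && buildout twice).
../python/bin/python ./bootstrap.py
./bin/buildout -Nv

# ffmpeg sub-buildout: no bootstrap here, just the buildout.
cd ffmpeg/
buildout -v
cd ..
```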

fixing uWSGI

There are a couple of problems with the transcode daemon not reporting errors in enough detail and/or breaking instead of reporting errors. Once I got past those, it seems that ffmpeg is failing with "Invalid data found when processing input". This turns out to be due to uWSGI segfaulting.

This issue is similar to this one:

... namely, buildout gets confused and builds uWSGI with the wrong XML library in places.

Solve this by removing libxml2 from the system (which necessarily removes packages that depend on it: libxml2-dev, libxslt, libxslt1-dev, etc. - keep track of these if you need to reinstall them), moving the old uWSGI build aside, and re-running the buildout. uWSGI is then built using the lxml that Plumi created, instead of a mixture of system and Plumi lxml libraries.

This causes a problem with the nginx build, so reinstall libxslt1-dev and re-run the buildout again, and finally it works:
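The whole sequence, as a sketch — the parts/uwsgi path is an assumption about where this buildout keeps the uWSGI build, and the exact set of libxml2/libxslt package names apt removes may differ on your system:

```shell
# 1. Remove the system libxml2 stack; note which packages apt removes
#    so they can be reinstalled later.
apt-get remove libxml2 libxml2-dev libxslt1.1 libxslt1-dev

# 2. Move the old uWSGI build out of the way so the buildout rebuilds
#    it against Plumi's own lxml rather than a mix of libraries.
mv parts/uwsgi parts/uwsgi.old

# 3. Re-run the buildout.
./bin/buildout -Nv

# 4. Reinstall libxslt1-dev (the nginx build needs it) and run the
#    buildout one more time.
apt-get install libxslt1-dev
./bin/buildout -Nv
```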

sudo ./bin/supervisord

plumi@dream:~/$ ./bin/supervisorctl status
cache                            RUNNING   pid 6738, uptime 0:00:08
nginx                            RUNNING   pid 6739, uptime 0:00:08
ploneftp                         RUNNING   pid 6737, uptime 0:00:08
transcodedaemon                  RUNNING   pid 6736, uptime 0:00:08
uwsgi                            RUNNING   pid 6734, uptime 0:00:08
worker                           RUNNING   pid 6735, uptime 0:00:08
zeo                              RUNNING   pid 6733, uptime 0:00:08
  • Currently we have a new problem when trying to start / restart transcoding:
/home/plumi/ UserWarning: DirectoryView plumi_content_custom_templates refers to a non-existing path 'plumi.content:skins/plumi_content_custom_templates'
  (, reg_key), UserWarning)
2015-08-20 10:04:31,724 WARNI [collective.transcode][uWSGIWorker2Core0] transcode entry timed out: (29bdb70b1220445c8169a51b4537aa19, video_file, jpeg, bb516947768fbb05b41a2487f200716e)
Traceback (most recent call last):
  File "/home/plumi/", line 90, in __call__
    app_iter = self.application(environ, replace_start_response)
  File "/home/plumi/", line 24, in __call__
    result = self.application(environ, save_status_and_headers)
  File "/home/plumi/", line 106, in __call__
    return self.application(environ, start_response)
  File "/home/plumi/", line 282, in publish_module
    response = _publish(request, 'Zope2')
  File "/home/plumi/", line 205, in publish
  File "/home/plumi/", line 77, in mapply
    if debug is not None: return debug(object,args,context)
  File "/home/plumi/", line 46, in call_object
    result=apply(object,args) # Type s<cr> to step into published object.
  File "/home/plumi/", line 13, in __call__
    res = tt.add(self.context, force=True)
  File "/home/plumi/", line 154, in add
    job = async.queueJobWithDelay(None, temp_time, transcode_request, obj, fieldName, UID, payload, secret, address, profile, options, portal_url)
  File "/home/plumi/", line 139, in queueJobWithDelay
    func, context, *args, **kwargs)
  File "/home/plumi/", line 127, in queueJobInQueueWithDelay
    queue = self.getQueues()[queue]
  File "/home/plumi/", line 100, in getQueues
    db = getUtility(IAsyncDatabase)
  File "/home/plumi/", line 169, in getUtility
    raise ComponentLookupError(interface, name)
zope.component.interfaces.ComponentLookupError: (<InterfaceClass>, '')

comment:5 Changed 12 months ago by anna

  • Status changed from new to closed
  • Resolution set to fixed

These issues are now resolved.

