
Dear Mateus,

Unfortunately these data are not available: at that time, data from the 3.6m (or 1.54m) telescope had not yet been ingested into the ESO Archive.

We are sorry we cannot help you further.

Dear User,

Thanks for contacting us.

There are no CCV files for radial velocity measurements.

The closest match are the CCF files.

You can find this type of file, for example, inside the ancillary HarpsTar file (a TAR file containing several types of files, both FITS and non-FITS).

You can get the ancillary files in this way: assuming that you have the ADP name of the file you are interested in (for example ADP.2014-09-16T11:06:15.853), go to the Science Portal, query for it, and reach the dataset information page.

From there, click on the blue "Dataset Download" button, which opens the download page and takes you to the Download Portal. There, tick

"Include ancillary data" and select your preferred download method. Once you have downloaded the archive, you can extract the ADP file and the ancillary file from it; the ancillary file comes as a tar file that you need to decompress.
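Once the tar file is on disk, its contents (including the CCF FITS files) can be listed and extracted with Python's standard tarfile module. A minimal sketch; the file name and the "ccf" filter pattern in the usage comment are illustrative assumptions, not archive conventions:

```python
import tarfile

def list_members(tar_path, pattern=None):
    """Return the member names of a tar file, optionally keeping only
    those whose name contains `pattern` (case-insensitive)."""
    with tarfile.open(tar_path) as tar:
        names = tar.getnames()
    if pattern:
        names = [n for n in names if pattern.lower() in n.lower()]
    return names

def extract_all(tar_path, dest="."):
    """Extract every member of the tar file into `dest`."""
    with tarfile.open(tar_path) as tar:
        tar.extractall(path=dest)

# Example usage (the file name is hypothetical):
# print(list_members("ADP.2014-09-16T11:06:15.853.tar", pattern="ccf"))
# extract_all("ADP.2014-09-16T11:06:15.853.tar", dest="ancillary")
```

Filtering by name is usually enough to locate the CCF products without unpacking the whole archive.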

The additional constraint only removes entries with null values, i.e. cases where

something went wrong during the observation. We are fixing this behaviour,

so in the next few days the initial query will also work.

You are not losing results: the GIRAFFE spectrograph, used in the MEDUSA multi-object spectroscopy mode, allows the observation of up to 130 distinct targets simultaneously.

Hence, each science raw file generates N products, where N corresponds to the number of science fibres. N can be as high as 130.
Also, for those observation blocks where more than one exposure exists, the individual 1D spectra are stacked, and the stacked spectra are also available as products.
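As a back-of-the-envelope check, the product count per observation block follows from the numbers above. A sketch; the helper function is ours for illustration (not an archive API), and it assumes one stacked spectrum per fibre whenever the OB has more than one exposure:

```python
def expected_products(n_exposures, n_science_fibres):
    """Rough expected number of 1D spectral products for one
    GIRAFFE/MEDUSA observation block: one spectrum per science fibre
    per exposure, plus one stacked spectrum per fibre when the OB
    contains more than one exposure (illustrative assumption)."""
    if not 1 <= n_science_fibres <= 130:
        raise ValueError("MEDUSA allows up to 130 science fibres")
    per_exposure = n_exposures * n_science_fibres
    stacked = n_science_fibres if n_exposures > 1 else 0
    return per_exposure + stacked

# A single exposure with 130 fibres yields 130 products;
# an OB with 3 exposures and 100 fibres yields 3*100 + 100 = 400.
```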

For more information please check the products documentation:

and the instrument pages and manuals available through


To avoid the error, please add the following constraint:

AND dec BETWEEN -90 and 90

This way, the query returns 7 files.
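Programmatically, the constraint can simply be appended to the ADQL query before submitting it via TAP. A sketch: the base query and target name are illustrative, and the actual TAP search (which needs pyvo and network access) is left commented out so the snippet also works offline:

```python
# Illustrative base query against the raw-data table; the table name
# and target are assumptions for the sake of the example.
base_query = """SELECT *
FROM dbo.raw
WHERE object = 'HD 1234'"""

# Appending the constraint filters out entries with null coordinates:
fixed_query = base_query + "\nAND dec BETWEEN -90 AND 90"

# To actually run it (requires pyvo and network access):
# import pyvo as vo
# tap = vo.dal.TAPService("http://archive.eso.org/tap_obs")
# res = tap.search(fixed_query)

print(fixed_query)
```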

The reason is that the raw database is not as well curated as the products one. We are going to look into this; thanks for reporting it!

Dear user,

There are two different options to upload the list of targets: 

  • You can upload your list of targets here to ensure that the cutout option will already be pre-selected in the Download Portal when downloading the data (option number 1 in the screenshot).

  • If you upload your target list from the main window, the functionality is the same, except that you need to select the cutout option in the Download Portal explicitly. The data volumes displayed in the Download Portal approximately reflect the size reduction achieved by the cutouts.

You can get more information by clicking on the scissors icon as shown in the screenshot below (option number 2), and a more extended explanation by clicking on "see dedicated help topic" (option number 3) in the displayed window.

Once you are in the Download Portal, regardless of whether you used a target list or a single target, the problem you are describing arises from any search for (1D) spectra by position (+ radius): there is nothing to cut out from a 1D spectrum based on the given positional constraint, so the cutout service is disabled (greyed out).
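The rule can be summarised in a tiny helper, which is ours and purely illustrative (not a Download Portal API): positional cutouts need a spatial axis to act on, which a 1D spectrum lacks.

```python
def cutout_available(dataproduct_type):
    """Illustrative rule of thumb: positional cutouts only make sense
    for products with a spatial axis (images, cubes); a 1D spectrum has
    none, so the cutout option is greyed out for it."""
    spatially_resolved = {"image", "cube"}
    return dataproduct_type.lower() in spatially_resolved

# cutout_available("spectrum") -> False (cutout option greyed out)
# cutout_available("cube")     -> True
```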

I hope this answers your question.

Best regards,

Archive Science Group

Dear Andrew,
The information that you should use the datalink service to get to the ancillary files is provided in the programmatic overview page: but I fully agree that this is not enough. I will publish a script for that.
Thanks for pointing this out. In the meantime...

Here I provide you preliminary snippets of the code necessary to do that, with explanations.
As said, it is the datalink service that allows you to find all kinds of files related to the input one. The datalink response for a HARPS calibrated spectrum contains, among others, the ancillary file you want to download (the tar ball).
As an example, try:

The ancillary files can be identified by the "semantics" field which must be set to "#auxiliary".
In python:
import pyvo as vo
import requests

# The helper module below (name assumed here)
# is published here:
import eso_programmatic as eso

session = requests.Session()

# ESO TAP service for observations:
tap = vo.dal.TAPService("http://archive.eso.org/tap_obs")

# Let's get the access_url of 3 HARPS products:
query = """SELECT top 3 access_url from ivoa.ObsCore where obs_collection='HARPS'"""
res = tap.search(query)

# Let's loop through those 3, and for each of them loop through its #auxiliary
# entries (the tar ball is the only #auxiliary entry for a HARPS product anyway):
for rec in res:
    datalink = vo.dal.adhoc.DatalinkResults.from_result_url(rec['access_url'],
                                                            session=session)
    ancillaries = datalink.bysemantics('#auxiliary')
    for anc in ancillaries:
        # For each ancillary, get its access_url, and use it to download the file.
        # Other useful info available:
        # print(anc['eso_category'], anc['eso_origfile'],
        #       anc['content_length'], anc['access_url'])
        status_code, filepath = eso.downloadURL(anc['access_url'], session=session)
        if status_code == 200:
            print("File {0} downloaded as {1}".format(anc['eso_origfile'], filepath))

The result is:

File HARPS.2006-08-09T05:48:52.136_DRS_HARPS_3.5.tar downloaded as ./ADP.2014-09-16T11:08:02.037.tar
File HARPS.2006-01-30T08:42:04.135_DRS_HARPS_3.5.tar downloaded as ./ADP.2014-09-16T11:04:44.533.tar
File HARPS.2006-07-30T07:45:53.333_DRS_HARPS_3.5.tar downloaded as ./ADP.2014-09-16T11:04:48.567.tar

If you prefer to download the tar ball with its original name, then add filename=anc['eso_origfile'], as in:
status_code, filepath = eso.downloadURL(anc['access_url'], filename=anc['eso_origfile'], session=session)
and you'll obtain:

File HARPS.2006-08-09T05:48:52.136_DRS_HARPS_3.5.tar downloaded as ./HARPS.2006-08-09T05:48:52.136_DRS_HARPS_3.5.tar
File HARPS.2006-01-30T08:42:04.135_DRS_HARPS_3.5.tar downloaded as ./HARPS.2006-01-30T08:42:04.135_DRS_HARPS_3.5.tar
File HARPS.2006-07-30T07:45:53.333_DRS_HARPS_3.5.tar downloaded as ./HARPS.2006-07-30T07:45:53.333_DRS_HARPS_3.5.tar

Thanks a lot for reporting the absence of examples on this!

Alberto Micol

ESO Archive Science Group

The walk-through on how to produce the figure above using programmatic access to the catalogue is illustrated in this forum entry (it requires the installation of the unofficial ESOAsg python package).