Rename package to azure-datalake-store (Azure#86)
jbcrail authored and begoldsm committed Oct 3, 2016
1 parent 1c1e576 commit 27be9e6
Showing 30 changed files with 77 additions and 70 deletions.
4 changes: 2 additions & 2 deletions .coveragerc
@@ -1,9 +1,9 @@
[run]
include =
adlfs/*
azure/datalake/store/*

omit =
adlfs/tests/test*
azure/datalake/store/tests/test*

[report]
show_missing = True
2 changes: 1 addition & 1 deletion .travis.yml
@@ -17,7 +17,7 @@ install:
- python setup.py develop

script:
- py.test -x -vvv --doctest-modules --pyargs adlfs tests
- py.test -x -vvv --doctest-modules --pyargs azure.datalake.store tests

notifications:
email: false
2 changes: 1 addition & 1 deletion MANIFEST.in
@@ -1,4 +1,4 @@
recursive-include adlfs *.py
recursive-include azure/datalake/store *.py
recursive-include docs *.rst

include setup.py
28 changes: 14 additions & 14 deletions README.rst
@@ -1,13 +1,13 @@
adlfs
====
azure-datalake-store
====================

.. image:: https://travis-ci.org/Azure/azure-data-lake-store-python.svg?branch=dev
:target: https://travis-ci.org/Azure/azure-data-lake-store-python

NOTE: This management package is currently under active development
and should not be consumed until the first version is published.

adlfs is a file-system management system in python for the
azure-datalake-store is a file-system management system in python for the
Azure Data-Lake Store.

To install:
@@ -25,7 +25,7 @@ To play with the code, here is a starting point:

.. code-block:: python
from adlfs import core, lib, multithread
from azure.datalake.store import core, lib, multithread
token = lib.auth(tenant_id, username, password)
adl = core.AzureDLFileSystem(store_name, token)
@@ -56,16 +56,16 @@ To play with the code, here is a starting point:
To interact with the API at a higher-level, you can use the provided
command-line interface in "adlfs/cli.py". You will need to set the appropriate
environment variables as described above to connect to the Azure Data Lake
Store.
command-line interface in "azure/datalake/store/cli.py". You will need to set
the appropriate environment variables as described above to connect to the
Azure Data Lake Store.

To start the CLI in interactive mode, run "python adlfs/cli.py" and then type
"help" to see all available commands (similar to Unix utilities):
To start the CLI in interactive mode, run "python azure/datalake/store/cli.py"
and then type "help" to see all available commands (similar to Unix utilities):

.. code-block:: bash
> python adlfs/cli.py
> python azure/datalake/store/cli.py
azure> help
Documented commands (type help <topic>):
@@ -84,7 +84,7 @@ familiar with the Unix/Linux "ls" command, the columns represent 1) permissions,

.. code-block:: bash
> python adlfs/cli.py
> python azure/datalake/store/cli.py
azure> ls -l
drwxrwx--- 0123abcd 0123abcd 0 Aug 02 12:44 azure1
-rwxrwx--- 0123abcd 0123abcd 1048576 Jul 25 18:33 abc.csv
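The long listing above maps onto fixed columns: permissions, owner, group, size in bytes, modification time, and name. As a rough illustration of consuming that output programmatically, a line can be split back into those fields; the `parse_ls_line` helper below is hypothetical and not part of the package (and assumes names contain no spaces):

```python
def parse_ls_line(line):
    # Columns as documented above: permissions, owner, group,
    # size in bytes, modification month/day/time, and entry name.
    mode, owner, group, size, month, day, clock, name = line.split()
    return {'mode': mode, 'owner': owner, 'group': group,
            'size': int(size),
            'modified': ' '.join((month, day, clock)),
            'name': name}

entry = parse_ls_line('-rwxrwx--- 0123abcd 0123abcd 1048576 Jul 25 18:33 abc.csv')
```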
@@ -104,7 +104,7 @@ named after the remote file minus the directory path.

.. code-block:: bash
> python adlfs/cli.py
> python azure/datalake/store/cli.py
azure> ls -l
drwxrwx--- 0123abcd 0123abcd 0 Aug 02 12:44 azure1
-rwxrwx--- 0123abcd 0123abcd 1048576 Jul 25 18:33 abc.csv
@@ -125,7 +125,7 @@ For example, listing the entries in the home directory:

.. code-block:: bash
> python adlfs/cli.py ls -l
> python azure/datalake/store/cli.py ls -l
drwxrwx--- 0123abcd 0123abcd 0 Aug 02 12:44 azure1
-rwxrwx--- 0123abcd 0123abcd 1048576 Jul 25 18:33 abc.csv
-r-xr-xr-x 0123abcd 0123abcd 36 Jul 22 18:32 xyz.csv
@@ -137,7 +137,7 @@ Also, downloading a remote file:

.. code-block:: bash
> python adlfs/cli.py get xyz.csv
> python azure/datalake/store/cli.py get xyz.csv
2016-08-04 18:57:48,603 - ADLFS - DEBUG - Creating empty file xyz.csv
2016-08-04 18:57:48,604 - ADLFS - DEBUG - Fetch: xyz.csv, 0-36
2016-08-04 18:57:49,726 - ADLFS - DEBUG - Downloaded to xyz.csv, byte offset 0
2 changes: 1 addition & 1 deletion appveyor.yml
@@ -15,4 +15,4 @@ install:
build: off

test_script:
- "%PYTHON%\\Scripts\\pytest.exe -x -vvv --doctest-modules --pyargs adlfs tests"
- "%PYTHON%\\Scripts\\pytest.exe -x -vvv --doctest-modules --pyargs azure.datalake.store tests"
1 change: 1 addition & 0 deletions azure/__init__.py
@@ -0,0 +1 @@
__import__('pkg_resources').declare_namespace(__name__)
1 change: 1 addition & 0 deletions azure/datalake/__init__.py
@@ -0,0 +1 @@
__import__('pkg_resources').declare_namespace(__name__)
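The two new one-line `__init__.py` files declare `azure` and `azure.datalake` as namespace packages, which lets this distribution share the `azure` namespace with other Azure SDK packages installed separately. A self-contained sketch of the same idea, using pkgutil-style namespace portions instead of `pkg_resources` (the directory and module names here are made up for illustration):

```python
import os
import sys
import tempfile

# Two independent "distributions" each contribute a portion of a
# shared 'azure' namespace, as azure-datalake-store does in practice.
root = tempfile.mkdtemp()
init = "__path__ = __import__('pkgutil').extend_path(__path__, __name__)\n"
for dist, module in [('dist_a', 'storage_demo'), ('dist_b', 'datalake_demo')]:
    pkg = os.path.join(root, dist, 'azure')
    os.makedirs(pkg)
    with open(os.path.join(pkg, '__init__.py'), 'w') as f:
        f.write(init)  # namespace declaration, one copy per portion
    with open(os.path.join(pkg, module + '.py'), 'w') as f:
        f.write('NAME = %r\n' % module)
    sys.path.insert(0, os.path.join(root, dist))

# Both portions now resolve under the single 'azure' package.
from azure import storage_demo, datalake_demo
```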
File renamed without changes.
8 changes: 4 additions & 4 deletions adlfs/cli.py → azure/datalake/store/cli.py
@@ -24,10 +24,10 @@
import stat
import sys

from adlfs.core import AzureDLFileSystem
from adlfs.lib import auth
from adlfs.multithread import ADLDownloader, ADLUploader
from adlfs.utils import write_stdout
from azure.datalake.store.core import AzureDLFileSystem
from azure.datalake.store.lib import auth
from azure.datalake.store.multithread import ADLDownloader, ADLUploader
from azure.datalake.store.utils import write_stdout


class AzureDataLakeFSCommand(cmd.Cmd, object):
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
4 changes: 2 additions & 2 deletions adlfs/multithread.py → azure/datalake/store/multithread.py
@@ -69,7 +69,7 @@ class ADLDownloader(object):
See Also
--------
adlfs.transfer.ADLTransferClient
azure.datalake.store.transfer.ADLTransferClient
"""
def __init__(self, adlfs, rpath, lpath, nthreads=None, chunksize=2**28,
buffersize=2**22, blocksize=2**22, client=None, run=True,
@@ -237,7 +237,7 @@ class ADLUploader(object):
See Also
--------
adlfs.transfer.ADLTransferClient
azure.datalake.store.transfer.ADLTransferClient
"""
def __init__(self, adlfs, rpath, lpath, nthreads=None, chunksize=2**28,
buffersize=2**22, blocksize=2**22, client=None, run=True,
4 changes: 2 additions & 2 deletions adlfs/transfer.py → azure/datalake/store/transfer.py
@@ -216,8 +216,8 @@ class ADLTransferClient(object):
See Also
--------
adlfs.multithread.ADLDownloader
adlfs.multithread.ADLUploader
azure.datalake.store.multithread.ADLDownloader
azure.datalake.store.multithread.ADLUploader
"""

def __init__(self, adlfs, name, transfer, merge=None, nthreads=None,
4 changes: 2 additions & 2 deletions adlfs/utils.py → azure/datalake/store/utils.py
@@ -18,9 +18,9 @@
WIN = platform.platform() == 'Windows'

if WIN:
datadir = os.path.join(os.environ['APPDATA'], 'adlfs')
datadir = os.path.join(os.environ['APPDATA'], 'azure-datalake-store')
else:
datadir = os.sep.join([os.path.expanduser("~"), '.config', 'adlfs'])
datadir = os.sep.join([os.path.expanduser("~"), '.config', 'azure-datalake-store'])

try:
os.makedirs(datadir)
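The hunk above selects a per-user data directory: `%APPDATA%` on Windows, `~/.config` elsewhere. Note that the unchanged context line compares `platform.platform() == 'Windows'`, but `platform.platform()` returns a composite string such as `'Windows-10-...'`, so `platform.system()` is the usual check. A standalone sketch of the equivalent logic (the `user_data_dir` function name is illustrative, not from the package):

```python
import os
import platform

def user_data_dir(app_name):
    # %APPDATA%\<app> on Windows, ~/.config/<app> elsewhere,
    # mirroring the datadir selection in utils.py.
    if platform.system() == 'Windows':
        return os.path.join(os.environ['APPDATA'], app_name)
    return os.path.join(os.path.expanduser('~'), '.config', app_name)

path = user_data_dir('azure-datalake-store')
```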
8 changes: 4 additions & 4 deletions docs/source/api.rst
@@ -1,7 +1,7 @@
API
===

.. currentmodule:: adlfs.core
.. currentmodule:: azure.datalake.store.core

.. autosummary::
AzureDLFileSystem
@@ -31,18 +31,18 @@ API
AzureDLFile.tell
AzureDLFile.write

.. currentmodule:: adlfs.multithread
.. currentmodule:: azure.datalake.store.multithread

.. autosummary::
ADLUploader
ADLDownloader

.. currentmodule:: adlfs.core
.. currentmodule:: azure.datalake.store.core

.. autoclass:: AzureDLFileSystem
:members:

.. currentmodule:: adlfs.multithread
.. currentmodule:: azure.datalake.store.multithread

.. autoclass:: AzureDLFile
:members:
18 changes: 9 additions & 9 deletions docs/source/conf.py
@@ -1,7 +1,7 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
#
# adlfs documentation build configuration file, created by
# azure-datalake-store documentation build configuration file, created by
# sphinx-quickstart on Mon Mar 21 15:20:01 2016.
#
# This file is execfile()d with the current directory set to its
@@ -53,7 +53,7 @@
master_doc = 'index'

# General information about the project.
project = 'adlfs'
project = 'azure-datalake-store'
copyright = 'TBD'
author = 'TBD'

@@ -62,8 +62,8 @@
# built documents.
#
# The short X.Y version.
import adlfs
version = adlfs.__version__
import azure.datalake.store
version = azure.datalake.store.__version__
# The full version, including alpha/beta/rc tags.
release = version

@@ -213,7 +213,7 @@
#html_search_scorer = 'scorer.js'

# Output file base name for HTML help builder.
htmlhelp_basename = 'adlfsdoc'
htmlhelp_basename = 'azure-datalake-store-doc'

# -- Options for LaTeX output ---------------------------------------------

@@ -235,7 +235,7 @@
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, 'adlfs.tex', 'adlfs Documentation',
(master_doc, 'azure-datalake-store.tex', 'azure-datalake-store Documentation',
'TBA', 'manual'),
]

@@ -265,7 +265,7 @@
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(master_doc, 'adlfs', 'adlfs Documentation',
(master_doc, 'azure-datalake-store', 'azure-datalake-store Documentation',
[author], 1)
]

@@ -279,8 +279,8 @@
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
(master_doc, 'adlfs', 'adlfs Documentation',
author, 'adlfs', 'One line description of project.',
(master_doc, 'azure-datalake-store', 'azure-datalake-store Documentation',
author, 'azure-datalake-store', 'One line description of project.',
'Miscellaneous'),
]

8 changes: 4 additions & 4 deletions docs/source/index.rst
@@ -1,5 +1,5 @@
adlfs
====
azure-datalake-store
====================

A pure-python interface to the Azure Data-lake Storage system, providing
pythonic file-system and file objects, seamless transition between Windows and
@@ -12,7 +12,7 @@ Installation
------------
Using ``pip``::

pip install adlfs
pip install azure-datalake-store

Manually (bleeding edge):

@@ -90,7 +90,7 @@ forward slashes or backslashes.
import pathlib # only >= Python 3.4
from pathlib2 import pathlib # only <= Python 3.3
from adlfs.core import AzureDLPath
from azure.datalake.store.core import AzureDLPath
# possible remote paths to use on API
p1 = '\\foo\\bar'
8 changes: 6 additions & 2 deletions setup.py
@@ -3,15 +3,19 @@
import os
from setuptools import setup

setup(name='adlfs',
setup(name='azure-datalake-store',
version='0.0.1',
description='Convenient Filesystem interface to Azure Data-lake Store',
url='https://github.com/Azure/azure-data-lake-store-python',
maintainer='',
maintainer_email='',
license='',
keywords='azure',
packages=['adlfs'],
packages=[
'azure',
'azure.datalake',
'azure.datalake.store'
],
install_requires=[open('requirements.txt').read().strip().split('\n')],
long_description=(open('README.rst').read() if os.path.exists('README.rst')
else ''),
6 changes: 3 additions & 3 deletions tests/README.md
@@ -1,6 +1,6 @@
To run the test suite:

py.test -x -vvv --doctest-modules --pyargs adlfs tests
    py.test -x -vvv --doctest-modules --pyargs azure.datalake.store tests

This test suite uses [VCR.py](https://github.com/kevin1024/vcrpy) to record the
responses from Azure. Borrowing from VCR's
@@ -11,11 +11,11 @@ invoking the test suite (defaults to `none`).

To record responses for a new test without updating previous recordings:

RECORD_MODE=once py.test -x -vvv --doctest-modules --pyargs adlfs tests
    RECORD_MODE=once py.test -x -vvv --doctest-modules --pyargs azure.datalake.store tests

To record responses for all tests even if previous recordings exist:

RECORD_MODE=all py.test -x -vvv --doctest-modules --pyargs adlfs tests
    RECORD_MODE=all py.test -x -vvv --doctest-modules --pyargs azure.datalake.store tests

When recording new responses, you will need valid Azure credentials. The following
environment variables should be defined:
8 changes: 4 additions & 4 deletions tests/benchmarks.py
@@ -6,9 +6,9 @@
import sys
import time

from adlfs import core, multithread
from adlfs.transfer import ADLTransferClient
from adlfs.utils import WIN
from azure.datalake.store import core, multithread
from azure.datalake.store.transfer import ADLTransferClient
from azure.datalake.store.utils import WIN
from tests.testing import md5sum


@@ -158,7 +158,7 @@ def bench_download_50_1gb(adl, lpath, rpath, config):
# Log only Azure messages, ignoring 3rd-party libraries
logging.basicConfig(
format='%(asctime)s %(name)-17s %(levelname)-8s %(message)s')
logger = logging.getLogger('adlfs')
logger = logging.getLogger('azure.datalake.store')
logger.setLevel(logging.INFO)

# Required setup until outstanding issues are resolved
2 changes: 1 addition & 1 deletion tests/conftest.py
@@ -8,7 +8,7 @@

import pytest

from adlfs import AzureDLFileSystem
from azure.datalake.store import AzureDLFileSystem
from tests import settings
from tests.testing import working_dir

2 changes: 1 addition & 1 deletion tests/settings.py
@@ -10,7 +10,7 @@
import os
import time

from adlfs.lib import auth
from azure.datalake.store.lib import auth
from tests import fake_settings


2 changes: 1 addition & 1 deletion tests/test_cli.py
@@ -13,7 +13,7 @@

import pytest

from adlfs.cli import AzureDataLakeFSCommand
from azure.datalake.store.cli import AzureDataLakeFSCommand
from tests.testing import azure, my_vcr, working_dir

