Compare commits


42 Commits

Author SHA1 Message Date
Tim Abbott
8a1e20f734 Release Zulip Server 1.8.1. 2018-05-07 15:25:25 -07:00
Tim Abbott
93d4c807a9 Add changelog for 1.8.1 release. 2018-05-07 15:24:45 -07:00
Tim Abbott
4741e683ce generate_secrets: Fix handling of an empty secrets file.
This is now a condition that happens during installation, because we
now create an empty file for this in puppet.
2018-05-07 09:38:05 -07:00
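
For illustration, the guard this implies might look like the following
minimal sketch, assuming a configparser-based reader (the names here
are illustrative, not Zulip's actual code):

    import configparser
    import os

    SECRETS_PATH = "/etc/zulip/zulip-secrets.conf"  # created empty by puppet

    def read_secrets(path=SECRETS_PATH):
        """Tolerate a missing or empty secrets file."""
        config = configparser.RawConfigParser()
        if os.path.exists(path):
            config.read(path)
        # An empty file has no [secrets] section at all, so fall back to {}.
        if config.has_section("secrets"):
            return dict(config.items("secrets"))
        return {}
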
Tim Abbott
d5252ff0c9 puppet: Ensure zulip user owns key /etc/zulip files.
The main purpose of this change is to make it guaranteed that
`manage.py register_server --rotate-key` can edit the
/etc/zulip/zulip-secrets.conf configuration via crudini.

But it also adds value by ensuring zulip-secrets.conf is not readable
by other users.
2018-05-07 09:38:05 -07:00
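
The write that has to succeed here is an ini-style edit; a rough Python
equivalent of what `crudini --set` does (a sketch, not the actual tool;
the `zulip_org_key` secret name is assumed for illustration):

    import configparser

    def rotate_key(new_key, path="/etc/zulip/zulip-secrets.conf"):
        config = configparser.RawConfigParser()
        config.read(path)
        if not config.has_section("secrets"):
            config.add_section("secrets")
        config.set("secrets", "zulip_org_key", new_key)  # key name assumed
        # This open() for writing is what fails unless the zulip user
        # owns /etc/zulip/zulip-secrets.conf.
        with open(path, "w") as f:
            config.write(f)
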
Tim Abbott
3d003a8f34 zilencer: Add automated signup system for push notifications.
With backported bug fixes to the tool included.

Based on an initial version by Rishi Gupta.

Fixes #7325.
2018-05-07 09:38:05 -07:00
Tim Abbott
1ec0414786 management: Refactor checkconfig code to live in library.
This makes it possible to call this from other management commands.
2018-05-07 09:38:05 -07:00
Rohitt Vashishtha
fd06380701 push_notifications: Format blockquotes properly in text_output.
New output is of the format:

Hamlet said:
> Polonius said:
> > This is the.
> > Second layer of nesting.
> First layer of nesting.

Fixes #9251.
2018-05-07 09:38:05 -07:00
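
The nesting works because each quoting level re-prefixes every line of
the quoted body with "> ".  A small sketch of that idea (illustrative,
not the actual text_output code):

    def quote_block(author, text):
        quoted = "\n".join("> " + line for line in text.split("\n"))
        return f"{author} said:\n{quoted}"

    # Building the example above, inside out:
    inner = quote_block("Polonius", "This is the.\nSecond layer of nesting.")
    print(quote_block("Hamlet", inner + "\nFirst layer of nesting."))
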
Rohitt Vashishtha
d345564ce2 sidebars: Disable autocomplete for user and stream search inputs.
Fixes #9269.
2018-05-07 09:38:05 -07:00
Tim Abbott
dbf19ae3e3 compose: Don't auto-scroll very tall messages on click.
This fixes an issue where, with very tall messages (more than about a
screen in height), clicking on the message would leave you scrolled to
its bottom, which usually felt annoying.

Fixes #8941.
2018-05-07 09:38:05 -07:00
Steve Howell
63437e89d7 Fix regression with topic edits clearing narrows.
We had a recent regression that had kind of a funny symptom.
If somebody else edited a topic while you were in a topic
narrow, even if it wasn't your topic, then your narrow would
mysteriously go empty upon the event coming to you.

The root cause of this is that when topic names change,
we often want to rerender a lot of the world due to muting.
But we want to suppress the re-render for topic narrows that
don't support the internal data structures.

This commit restores a simple guard condition that got lost
in a recent refactoring:

        see 3f736c9b06

From tabbott: This is not the correct ultimate place to end up,
because if a topic-edit moves messages in or out of a topic, the new
behavior is wrong.  But the bug this fixes is a lot worse than that,
and no super local change would do the right thing here.
2018-05-07 09:38:05 -07:00
Tim Abbott
0a065636c9 message_list: Fix hiding messages edited to a muted topic.
Previously, we did a rerender without first re-computing which
messages were muted; this was incorrect, because whether a message is
muted can change if the topic changes.

Fixes #9241.
2018-05-07 09:38:05 -07:00
Tim Abbott
d61a8c96c5 message_list: Clean up API for rerender_after_muting_changes.
This was only called from two places in one function, and we can just
check muting_enabled in the caller.

This refactor is important, because we might need to update muting
after other changes (specifically, message editing to move a topic to
be muted/non-muted).
2018-05-07 09:38:04 -07:00
Tim Abbott
e346044d6a email_mirror: Fix handling of empty topic.
Also fixes some corner cases around pure-whitespace topics, and
migrates from the years-obsolete "no subject".

Fixes #9207.
2018-05-07 09:38:04 -07:00
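
The behavior being described amounts to a small normalization step,
sketched here (illustrative; "(no topic)" is Zulip's modern placeholder
for topic-less messages):

    def normalize_topic(subject):
        topic = (subject or "").strip()
        # Empty and whitespace-only subjects fall back to the placeholder,
        # instead of the years-obsolete "no subject".
        return topic if topic else "(no topic)"
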
Shubham Dhama
b1ff1633b1 settings: Make bot-settings tabs look better in dark mode.
Fixes: #9230.
2018-05-07 09:38:04 -07:00
Shubham Dhama
1b253cb9e0 org settings: Fix word-wrapping CSS in "users" section.
Fixes: #9225.
2018-05-07 09:38:04 -07:00
Shubham Dhama
f612274f91 settings: Fix escaping of HTML in checkbox labels.
Some labels, like the one for `translate_emoticons`, contain HTML
and get escaped because of the `{{ label }}` syntax, which escapes
strings to protect against XSS.  Since these labels pose no such
security risk, we can use the triple-curly-brace `{{{ label }}}`
syntax instead.

Fixes: #9231.
2018-05-07 09:38:04 -07:00
Tim Abbott
a9e6ad5c6a ldap: Disable django-auth-ldap caching of users.
This shouldn't have a material performance impact, since we don't
query these except during login.  Meanwhile, it fixes an issue where,
after updating the LDAP configuration, one needed to restart memcached
(which usually manifested as "needing to reboot the whole server")
before a user who was migrated from one OU to another could log in.

Fixes #9057.
2018-05-07 09:38:04 -07:00
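
In Django settings terms, the change amounts to turning off the
library's caching knob; a sketch (which exact django-auth-ldap setting
the commit touches is an assumption here):

    # zproject/settings.py (sketch)
    # AUTH_LDAP_CACHE_GROUPS is django-auth-ldap's documented switch for
    # caching LDAP-derived data in Django's cache (i.e. memcached);
    # assumed here to be the relevant knob.
    AUTH_LDAP_CACHE_GROUPS = False  # always re-query LDAP at login
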
Tim Abbott
eae16d42d4 unread: Fix messages that cannot be marked as read in narrows.
If you visit a narrow that has unread messages on it that aren't part
of the home view (e.g. in a muted stream), then we were never calling
`message_util.do_unread_count_updates`, and more importantly,
`unread.process_loaded_messages` on those messages.  As a result, they
would stay unread, and moving the cursor over them would never mark
those messages as read (which was visible through the little green
marker never disappearing).

I can't tell whether this fixes #8042 and/or #8236; neither of them
exactly fits the description of this issue unless the PM threads in
question were muted or something, but this does feel related.
2018-05-07 09:38:04 -07:00
Tim Abbott
ca221da997 left sidebar: Fix clipping of private message users with "g" in name.
This fixes an issue where users whose names had a "g" in them would
have the "g" clipped in the "private messages" section in the left sidebar.

We avoid a change in the effective visible line-height by shrinking
the margin.
2018-05-07 09:38:04 -07:00
Cynthia Lin
c16b252699 analytics: Eliminate slider-focused text selection in Firefox.
Fixes #9151.
2018-05-07 09:38:04 -07:00
Tim Abbott
e3f8108ca6 gitlab: Document the local network security setting.
This should help users debug issues with the GitLab webhook not
working with recent GitLab releases.
2018-05-07 09:38:04 -07:00
Cynthia Lin
5530fe8cb1 night-mode: Change coloring for compose close button. 2018-05-07 09:38:04 -07:00
Cynthia Lin
fca8479065 compose: Refactor compose box from <table> to <div> structure.
`<td>` elements are fixed-width, so we refactor the entire
`<table>` structure for responsive design.

This fixes a bug with how the `To:` block looks in other languages.

Fixes #9152.
2018-05-07 09:38:04 -07:00
Shubham Dhama
fe34001dd1 settings: Make saving spinner visible in night mode.
Fixes: #9154.
2018-05-07 09:38:03 -07:00
Tim Abbott
9a6b4aeda2 puppet: Allow manual configuration of postfix_mailname.
This allows users to configure a mailname for postfix in
/etc/zulip/zulip.conf.
2018-05-07 09:38:03 -07:00
Greg Price
76957a62a5 install: Expand error message for missing SSL cert slightly.
It wasn't obvious reading this message that you can perfectly well
bring your own SSL/TLS certificate; unless you read quite a bit
between the lines where we say "could not find", or followed the link
to the detailed docs, the message sounded like you had to either use
--certbot or --self-signed-cert.

So, explicitly mention the BYO option.  Because the "complete chain"
requirement is a bit tricky, don't try to give instructions for it
in this message; just refer the reader to the docs.

Also, drop the logic to identify which of the files is missing; it
certainly makes the code more complex, and I think even the error
message is actually clearer when it just gives the complete list of
required files -- it's much more likely that the reader doesn't know
what's required than that they do and have missed one, and even then
it's easy for them to look for themselves.
2018-05-07 09:38:03 -07:00
Cynthia Lin
76f6d9aaa2 night-mode: Add borders to input pill containers. 2018-05-07 09:38:03 -07:00
Cynthia Lin
5d9eadb734 compose: Fix styling of PM recipient input pill.
Fixes #9128.
2018-05-07 09:38:03 -07:00
Tim Abbott
cb8941a081 left sidebar: Fix line-height causing clipping.
Fixes #8209.
2018-05-07 09:38:03 -07:00
Tim Abbott
062df3697a slack import: Fix issues with Slack empty files.
Fixes #9217.
2018-05-07 09:38:03 -07:00
Preston Hansen
ad113134c7 slack import: Update build_zerver_realm to use Realm defaults.
Fixes #9131.
2018-05-07 09:38:03 -07:00
Tim Abbott
c4b2e986c3 test-backend: Update coverage excludes for new import_realm.py. 2018-05-07 09:38:03 -07:00
Tim Abbott
1b49c5658c import: Split out import.py into its own module.
This should make it a bit easier to find the code.
2018-05-07 09:38:03 -07:00
Preston Hansen
cbdb3d6bbf slack import: Be less strict in check_subdomain_available.
If the sysadmin is doing something explicit in a management command,
it's OK to take a reserved or short subdomain.

Fixes #9166.
2018-05-07 09:38:03 -07:00
Tim Abbott
97ccdacb18 import: Fix ordering of subdomain availability check.
When you're importing with --destroy-rebuild-database, we need to
check subdomain availability after we've cleared out the database;
otherwise, trying to reuse the same subdomain doesn't work.
2018-05-07 09:38:03 -07:00
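
The corrected ordering, as a runnable sketch (the helper names are
illustrative stand-ins, not the real management-command internals):

    def rebuild_database():  # stub for illustration
        print("dropping and recreating the database")

    def check_subdomain_available(subdomain):  # stub for illustration
        print(f"checking that {subdomain!r} is free")

    def import_realm(subdomain):  # stub for illustration
        print(f"importing realm into {subdomain!r}")

    def do_import(subdomain, destroy_rebuild_database=False):
        # Clear the database *before* the availability check, so that
        # re-importing into the same subdomain doesn't trip over the
        # realm that's about to be destroyed.
        if destroy_rebuild_database:
            rebuild_database()
        check_subdomain_available(subdomain)
        import_realm(subdomain)
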
Tim Abbott
e96af7906d slack import: Document how to send password resets to all users.
This is likely to be an important follow-up step after one finishes
the Slack import.
2018-05-07 09:38:03 -07:00
Tim Abbott
0d1e401922 slack import: Fix documentation on path to run manage.py. 2018-05-07 09:38:02 -07:00
Tim Abbott
8b599c1ed7 slack import: Don't try to import pinned/unpinned items.
There isn't a corresponding Zulip concept, and they don't have a
"text" attribute, so there's no message content to import.
2018-05-07 09:38:02 -07:00
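
A sketch of the skip logic (field names follow Slack's export format;
this is illustrative, not the importer's actual code):

    def should_skip_message(message):
        if message.get("subtype") in ("pinned_item", "unpinned_item"):
            return True  # no corresponding Zulip concept
        if "text" not in message:
            return True  # nothing to import as message content
        return False
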
Tim Abbott
a852532c95 slack import: Refactor handling of dropped messages.
This is a more coherent ordering, because some messages we skip lack a
"text" attribute.
2018-05-07 09:38:02 -07:00
Tim Abbott
8e57a3958d slack import: Improve error handling for invalid messages. 2018-05-07 09:38:02 -07:00
Tim Abbott
86046ae9c3 slack import: Remove unnecessary zerver_realm_skeleton.json.
This was stored as a fixture file under zerver/fixtures, which caused
problems, since we don't ship that directory to production (as it's
part of the test system).

The simplest emergency fix here would be to just move the file, but
when looking at it, it's clear that we don't need or want a fixture
file here; we want a Python object, so we just do that.

A valuable follow-up improvement to this block would be to create an
actual new Realm object (not saved to the database), and dump it with
the same code we use in the export tool; that should handle the vast
majority of these correctly.

Fixes #9123.
2018-05-07 09:38:02 -07:00
Tim Abbott
c0096932a6 stream_data: Fix exception when notifications_stream is private.
If notifications_stream is private and the current user has never been
subscribed, then we would throw an exception when trying to look up
notifications_stream.  In this situation, we should just treat it like
the stream doesn't exist for the purposes of this user.
2018-05-07 09:38:02 -07:00
4728 changed files with 271299 additions and 552903 deletions

.browserslistrc

@@ -1,6 +0,0 @@
> 0.2%
> 0.2% in US
last 2 versions
Firefox ESR
not dead
Chrome 26 # similar to PhantomJS

.circleci/config.yml

@@ -1,373 +1,145 @@
# See https://zulip.readthedocs.io/en/latest/testing/continuous-integration.html for
# high-level documentation on our CircleCI setup.
# See CircleCI upstream's docs on this config format:
# https://circleci.com/docs/2.0/language-python/
#
version: 2.0
version: 2
aliases:
- &create_cache_directories
run:
name: create cache directories
command: |
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R circleci "${dirs[@]}"
- &restore_cache_package_json
restore_cache:
keys:
- v1-npm-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- &restore_cache_requirements
restore_cache:
keys:
- v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "requirements/thumbor-dev.txt" }}-{{ checksum "requirements/dev.txt" }}
- &restore_emoji_cache
restore_cache:
keys:
- v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "tools/setup/emoji/emoji_map.json" }}-{{ checksum "tools/setup/emoji/build_emoji" }}-{{checksum "tools/setup/emoji/emoji_setup_utils.py" }}-{{ checksum "tools/setup/emoji/emoji_names.py" }}-{{ checksum "package.json" }}
- &install_dependencies
run:
name: install dependencies
command: |
sudo apt-get update
# Install moreutils so we can use `ts` and `mispipe` in the following.
sudo apt-get install -y moreutils
# CircleCI sets the following in Git config at clone time:
# url.ssh://git@github.com.insteadOf https://github.com
# This breaks the Git clones in the NVM `install.sh` we run
# in `install-node`.
# TODO: figure out why that breaks, and whether we want it.
# (Is it an optimization?)
rm -f /home/circleci/.gitconfig
# This is the main setup job for the test suite
mispipe "tools/ci/setup-backend --skip-dev-db-build" ts
# Cleaning caches is mostly unnecessary in Circle, because
# most builds don't get to write to the cache.
# mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0 2>&1" ts
- &save_cache_package_json
save_cache:
paths:
- /srv/zulip-npm-cache
key: v1-npm-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- &save_cache_requirements
save_cache:
paths:
- /srv/zulip-venv-cache
key: v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "requirements/thumbor-dev.txt" }}-{{ checksum "requirements/dev.txt" }}
- &save_emoji_cache
save_cache:
paths:
- /srv/zulip-emoji-cache
key: v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "tools/setup/emoji/emoji_map.json" }}-{{ checksum "tools/setup/emoji/build_emoji" }}-{{checksum "tools/setup/emoji/emoji_setup_utils.py" }}-{{ checksum "tools/setup/emoji/emoji_names.py" }}-{{ checksum "package.json" }}
- &do_bionic_hack
run:
name: do Bionic hack
command: |
# Temporary hack till `sudo service redis-server start` gets fixes in Bionic. See
# https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
sudo sed -i '/^bind/s/bind.*/bind 0.0.0.0/' /etc/redis/redis.conf
- &run_backend_tests
run:
name: run backend tests
command: |
. /srv/zulip-py3-venv/bin/activate
mispipe "./tools/ci/backend 2>&1" ts
- &run_frontend_tests
run:
name: run frontend tests
command: |
. /srv/zulip-py3-venv/bin/activate
mispipe "./tools/ci/frontend 2>&1" ts
- &upload_coverage_report
run:
name: upload coverage report
command: |
# codecov requires `.coverage` file to be stored in pwd for
# uploading coverage results.
mv /home/circleci/zulip/var/.coverage /home/circleci/zulip/.coverage
. /srv/zulip-py3-venv/bin/activate
# TODO: Check that the next release of codecov doesn't
# throw find error.
# codecov==2.0.16 introduced a bug which uses "find"
# for locating files which is buggy on some platforms.
# It was fixed via https://github.com/codecov/codecov-python/pull/217
# and should get automatically fixed here once it's released.
# We cannot pin the version here because we need the latest version for uploading files.
# see https://community.codecov.io/t/http-400-while-uploading-to-s3-with-python-codecov-from-travis/1428/7
pip install codecov && codecov \
|| echo "Error in uploading coverage reports to codecov.io."
- &build_production
run:
name: build production
command: |
sudo apt-get update
# Install moreutils so we can use `ts` and `mispipe` in the following.
sudo apt-get install -y moreutils
mispipe "./tools/ci/production-build 2>&1" ts
- &production_extract_tarball
run:
name: production extract tarball
command: |
sudo apt-get update
# Install moreutils so we can use `ts` and `mispipe` in the following.
sudo apt-get install -y moreutils
mispipe "/tmp/production-extract-tarball 2>&1" ts
- &install_production
run:
name: install production
command: |
sudo service rabbitmq-server restart
sudo --preserve-env=CIRCLECI mispipe "/tmp/production-install 2>&1" ts
- &verify_production
run:
name: verify install
command: |
sudo --preserve-env=CIRCLECI mispipe "/tmp/production-verify 2>&1" ts
- &upgrade_postgresql
run:
name: upgrade postgresql
command: |
sudo --preserve-env=CIRCLECI mispipe "/tmp/production-upgrade-pg 2>&1" ts
- &check_xenial_provision_error
run:
name: check tools/provision error message on xenial
command: |
! tools/provision > >(tee provision.out)
grep -Fqx 'CRITICAL:root:Unsupported platform: ubuntu 16.04' provision.out
- &check_xenial_upgrade_error
run:
name: check scripts/lib/upgrade-zulip-stage-2 error message on xenial
command: |
! sudo scripts/lib/upgrade-zulip-stage-2 2> >(tee upgrade.err >&2)
grep -Fq 'upgrade-zulip-stage-2: Unsupported platform: ubuntu 16.04' upgrade.err
- &notify_failure_status
run:
name: On fail
when: on_fail
branches:
only: master
command: |
if [[ "$CIRCLE_REPOSITORY_URL" == "git@github.com:zulip/zulip.git" && "$ZULIP_BOT_KEY" != "" ]]; then
curl -H "Content-Type: application/json" \
-X POST -i 'https://chat.zulip.org/api/v1/external/circleci?api_key='"$ZULIP_BOT_KEY"'&stream=automated%20testing&topic=master%20failing' \
-d '{"payload": { "branch": "'"$CIRCLE_BRANCH"'", "reponame": "'"$CIRCLE_PROJECT_REPONAME"'", "status": "failed", "build_url": "'"$CIRCLE_BUILD_URL"'", "username": "'"$CIRCLE_USERNAME"'"}}'
fi
jobs:
"bionic-backend-frontend":
"trusty-python-3.4":
docker:
# This is built from tools/ci/images/bionic/Dockerfile .
# Bionic ships with Python 3.6.
- image: arpit551/circleci:bionic-python-test
# This is built from tools/circleci/images/trusty/Dockerfile .
- image: gregprice/circleci:trusty-python-4.test
working_directory: ~/zulip
steps:
- checkout
- *create_cache_directories
- *do_bionic_hack
- *restore_cache_package_json
- *restore_cache_requirements
- *restore_emoji_cache
- *install_dependencies
- *save_cache_package_json
- *save_cache_requirements
- *save_emoji_cache
- *run_backend_tests
- run:
name: test locked requirements
- run:
name: create cache directories
command: |
dirs=(/srv/zulip-{npm,venv}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R circleci "${dirs[@]}"
- restore_cache:
keys:
- v1-npm-base.trusty-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- restore_cache:
keys:
- v1-venv-base.trusty-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
- run:
name: install dependencies
command: |
# Install moreutils so we can use `ts` and `mispipe` in the following.
sudo apt-get install -y moreutils
# CircleCI sets the following in Git config at clone time:
# url.ssh://git@github.com.insteadOf https://github.com
# This breaks the Git clones in the NVM `install.sh` we run
# in `install-node`.
# TODO: figure out why that breaks, and whether we want it.
# (Is it an optimization?)
rm -f /home/circleci/.gitconfig
# This is the main setup job for the test suite
mispipe "tools/travis/setup-backend" ts
# Cleaning caches is mostly unnecessary in Circle, because
# most builds don't get to write to the cache.
# mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0" ts
- save_cache:
paths:
- /srv/zulip-npm-cache
key: v1-npm-base.trusty-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- save_cache:
paths:
- /srv/zulip-venv-cache
key: v1-venv-base.trusty-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
# TODO: in Travis we also cache ~/zulip-emoji-cache, ~/node, ~/misc
# The moment of truth! Run the tests.
- run:
name: run backend tests
command: |
. /srv/zulip-py3-venv/bin/activate
mispipe "./tools/test-locked-requirements 2>&1" ts
mispipe ./tools/travis/backend ts
- *run_frontend_tests
# We only need to upload coverage reports on whichever platform
# runs the frontend tests.
- *upload_coverage_report
- store_artifacts:
path: ./var/casper/
destination: casper
- store_artifacts:
path: ./var/puppeteer/
destination: puppeteer
- store_test_results:
path: ./var/xunit-test-results/casper/
- *notify_failure_status
"focal-backend":
docker:
# This is built from tools/ci/images/focal/Dockerfile.
# Focal ships with Python 3.8.2.
- image: arpit551/circleci:focal-python-test
working_directory: ~/zulip
steps:
- checkout
- *create_cache_directories
- *restore_cache_package_json
- *restore_cache_requirements
- *restore_emoji_cache
- *install_dependencies
- *save_cache_package_json
- *save_cache_requirements
- *save_emoji_cache
- *run_backend_tests
- run:
name: Check development database build
command: mispipe "tools/ci/setup-backend" ts
- *notify_failure_status
"xenial-legacy":
docker:
- image: arpit551/circleci:xenial-python-test
working_directory: ~/zulip
steps:
- checkout
- *check_xenial_provision_error
- *check_xenial_upgrade_error
- *notify_failure_status
"bionic-production-build":
docker:
# This is built from tools/ci/images/bionic/Dockerfile .
# Bionic ships with Python 3.6.
- image: arpit551/circleci:bionic-python-test
working_directory: ~/zulip
steps:
- checkout
- *create_cache_directories
- *do_bionic_hack
- *restore_cache_package_json
- *restore_cache_requirements
- *restore_emoji_cache
- *build_production
- *save_cache_package_json
- *save_cache_requirements
- *save_emoji_cache
# Persist the built tarball to be used in downstream job
# for installation of production server.
# See https://circleci.com/docs/2.0/workflows/#using-workspaces-to-share-data-among-jobs
- persist_to_workspace:
# Must be an absolute path,
# or relative path from working_directory.
# This is a directory on the container which is
# taken to be the root directory of the workspace.
root: /tmp/production-build
# Must be relative path from root
paths:
- "*"
- *notify_failure_status
"bionic-production-install":
docker:
# This is built from tools/ci/images/bionic/Dockerfile .
# Bionic ships with Python 3.6.
- image: arpit551/circleci:bionic-python-test
working_directory: ~/zulip
steps:
# Contains the built tarball from bionic-production-build job
- attach_workspace:
# Must be absolute path or relative path from working_directory
at: /tmp
- *create_cache_directories
- *do_bionic_hack
- *production_extract_tarball
- *restore_cache_package_json
- *install_production
- *verify_production
- *upgrade_postgresql
- *verify_production
- *save_cache_package_json
- *notify_failure_status
"focal-production-install":
docker:
# This is built from tools/ci/images/focal/Dockerfile.
# Focal ships with Python 3.8.2.
- image: arpit551/circleci:focal-python-test
working_directory: ~/zulip
steps:
# Contains the built tarball from bionic-production-build job
- attach_workspace:
# Must be absolute path or relative path from working_directory
at: /tmp
- *create_cache_directories
- run:
name: do memcached hack
command: |
# Temporary hack till memcached upstream is updated in Focal.
# https://bugs.launchpad.net/ubuntu/+source/memcached/+bug/1878721
echo "export SASL_CONF_PATH=/etc/sasl2" | sudo tee -a /etc/default/memcached
- *production_extract_tarball
- *restore_cache_package_json
- *install_production
- *verify_production
- *save_cache_package_json
- *notify_failure_status
- run:
name: run frontend tests
command: |
. /srv/zulip-py3-venv/bin/activate
mispipe ./tools/travis/frontend ts
- run:
name: upload coverage report
command: |
. /srv/zulip-py3-venv/bin/activate
pip install codecov && codecov \
|| echo "Error in uploading coverage reports to codecov.io."
# - store_artifacts: # TODO
# path: var/casper/
# # also /tmp/zulip-test-event-log/
# destination: test-reports
"xenial-python-3.5":
docker:
# This is built from tools/circleci/images/xenial/Dockerfile .
- image: gregprice/circleci:xenial-python-3.test
working_directory: ~/zulip
steps:
- checkout
- run:
name: create cache directories
command: |
dirs=(/srv/zulip-{npm,venv}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R circleci "${dirs[@]}"
- restore_cache:
keys:
- v1-npm-base.xenial-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- restore_cache:
keys:
- v1-venv-base.xenial-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
- run:
name: install dependencies
command: |
sudo apt-get install -y moreutils
rm -f /home/circleci/.gitconfig
mispipe "tools/travis/setup-backend" ts
- save_cache:
paths:
- /srv/zulip-npm-cache
key: v1-npm-base.xenial-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- save_cache:
paths:
- /srv/zulip-venv-cache
key: v1-venv-base.xenial-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
- run:
name: run backend tests
command: |
. /srv/zulip-py3-venv/bin/activate
mispipe ./tools/travis/backend ts
- run:
name: upload coverage report
command: |
. /srv/zulip-py3-venv/bin/activate
pip install codecov && codecov \
|| echo "Error in uploading coverage reports to codecov.io."
workflows:
version: 2
build:
jobs:
- "trusty-python-3.4"
- "xenial-python-3.5"
"Ubuntu 16.04 Xenial (Python 3.5, legacy)":
jobs:
- "xenial-legacy"
"Ubuntu 18.04 Bionic (Python 3.6, backend+frontend)":
jobs:
- "bionic-backend-frontend"
"Ubuntu 20.04 Focal (Python 3.8, backend)":
jobs:
- "focal-backend"
"Production":
jobs:
- "bionic-production-build"
- "bionic-production-install":
requires:
- "bionic-production-build"
- "focal-production-install":
requires:
- "bionic-production-build"

.codecov.yml

@@ -5,8 +5,6 @@ coverage:
project:
default:
target: auto
# Codecov has the tendency to report a lot of false negatives,
# so we basically suppress comments completely.
threshold: 50%
threshold: 0.50
base: auto
patch: off

.editorconfig

@@ -3,22 +3,17 @@ root = true
[*]
end_of_line = lf
charset = utf-8
indent_style = space
insert_final_newline = true
trim_trailing_whitespace = true
[*.{sh,py,pyi,xml,css,scss,hbs,html}]
indent_size = 4
[{*.{js,json,ts},check-openapi}]
indent_size = 4
max_line_length = 100
[*.{py,pyi}]
max_line_length = 110
[*.{svg,rb,pp,yaml,yml}]
indent_size = 2
[package.json]
indent_size = 2
[*.{sh,py,pyi,js,json,yml,xml,css,md,markdown,handlebars,html}]
indent_style = space
indent_size = 4
[*.{svg,rb,pp,pl}]
indent_style = space
indent_size = 2
[*.{cfg}]
indent_style = space
indent_size = 8

.eslintignore

@@ -1,10 +1,2 @@
# This is intended for generated files and vendored third-party files.
# For our source code, instead of adding files here, consider using
# specific eslint-disable comments in the files themselves.
/docs/_build
/static/generated
/static/third
/static/webpack-bundles
/var
/zulip-py3-venv
static/js/blueslip.js
static/webpack-bundles

.eslintrc.json

@@ -1,408 +1,349 @@
{
"env": {
"es2020": true,
"node": true
},
"extends": ["eslint:recommended", "plugin:import/errors", "plugin:import/warnings", "prettier"],
"parser": "@babel/eslint-parser",
"parserOptions": {
"warnOnUnsupportedTypeScriptVersion": false,
"sourceType": "unambiguous"
},
"plugins": ["eslint-plugin-empty-returns"],
"env": {
"node": true,
"es6": true
},
"parserOptions": {
"sourceType": "module"
},
"globals": {
"$": false,
"_": false,
"jQuery": false,
"Spinner": false,
"Handlebars": false,
"XDate": false,
"zxcvbn": false,
"LazyLoad": false,
"SockJS": false,
"marked": false,
"md5": false,
"moment": false,
"i18n": false,
"DynamicText": false,
"LightboxCanvas": false,
"bridge": false,
"page_params": false,
"attachments_ui": false,
"csrf_token": false,
"typeahead_helper": false,
"pygments_data": false,
"popovers": false,
"server_events": false,
"server_events_dispatch": false,
"message_scroll": false,
"keydown_util": false,
"info_overlay": false,
"ui": false,
"ui_report": false,
"night_mode": false,
"ui_util": false,
"lightbox": false,
"input_pill": false,
"user_pill": false,
"compose_pm_pill": false,
"stream_color": false,
"people": false,
"user_groups": false,
"navigate": false,
"toMarkdown": false,
"settings_account": false,
"settings_display": false,
"settings_notifications": false,
"settings_muting": false,
"settings_lab": false,
"settings_bots": false,
"settings_sections": false,
"settings_emoji": false,
"settings_org": false,
"settings_ui": false,
"settings_users": false,
"settings_streams": false,
"settings_filters": false,
"settings_invites": false,
"settings_user_groups": false,
"settings_profile_fields": false,
"settings": false,
"resize": false,
"loading": false,
"typing": false,
"typing_events": false,
"typing_data": false,
"typing_status": false,
"sent_messages": false,
"transmit": false,
"compose": false,
"compose_actions": false,
"compose_state": false,
"compose_fade": false,
"overlays": false,
"stream_create": false,
"stream_edit": false,
"subs": false,
"stream_muting": false,
"stream_events": false,
"timerender": false,
"message_live_update": false,
"message_edit": false,
"reload": false,
"composebox_typeahead": false,
"search": false,
"topic_list": false,
"topic_generator": false,
"gear_menu": false,
"hashchange": false,
"hash_util": false,
"FetchStatus": false,
"message_list": false,
"Filter": false,
"flatpickr": false,
"pointer": false,
"util": false,
"MessageListView": false,
"blueslip": false,
"rows": false,
"WinChan": false,
"muting_ui": false,
"Socket": false,
"channel": false,
"components": false,
"message_viewport": false,
"upload_widget": false,
"avatar": false,
"realm_icon": false,
"feature_flags": false,
"search_suggestion": false,
"notifications": false,
"message_flags": false,
"bot_data": false,
"top_left_corner": false,
"stream_sort": false,
"stream_list": false,
"stream_popover": false,
"narrow_state": false,
"narrow": false,
"admin_sections": false,
"admin": false,
"stream_data": false,
"topic_data": false,
"list_util": false,
"muting": false,
"Dict": false,
"unread": false,
"alert_words_ui": false,
"message_store": false,
"message_util": false,
"message_events": false,
"message_fetch": false,
"favicon": false,
"condense": false,
"list_render": false,
"floating_recipient_bar": false,
"tab_bar": false,
"emoji": false,
"presence": false,
"activity": false,
"invite": false,
"colorspace": false,
"reactions": false,
"tutorial": false,
"templates": false,
"alert_words": false,
"fenced_code": false,
"markdown": false,
"echo": false,
"localstorage": false,
"localStorage": false,
"current_msg_list": true,
"home_msg_list": false,
"pm_list": false,
"pm_conversations": false,
"recent_senders": false,
"unread_ui": false,
"unread_ops": false,
"upload": false,
"user_events": false,
"Plotly": false,
"emoji_codes": false,
"drafts": false,
"katex": false,
"ClipboardJS": false,
"emoji_picker": false,
"hotspots": false,
"compose_ui": false,
"common": false,
"panels": false,
"PerfectScrollbar": false
},
"plugins": [
"eslint-plugin-empty-returns"
],
"rules": { "rules": {
"array-callback-return": "error", "array-callback-return": "error",
"arrow-body-style": "error", "array-bracket-spacing": "error",
"block-scoped-var": "error", "arrow-spacing": [ "error", { "before": true, "after": true } ],
"curly": "error", "block-scoped-var": 2,
"dot-notation": "error", "brace-style": [ "error", "1tbs", { "allowSingleLine": true } ],
"empty-returns/main": "error", "camelcase": 0,
"eqeqeq": "error", "comma-dangle": [ "error",
"guard-for-in": "error",
"import/first": "error",
"import/newline-after-import": "error",
"import/no-unresolved": ["error", {"ignore": ["!"]}],
"import/order": [
"error",
{ {
"alphabetize": {"order": "asc"}, "arrays": "always-multiline",
"newlines-between": "always" "objects": "always-multiline",
"imports": "always-multiline",
"exports": "always-multiline",
"functions": "never"
} }
], ],
"import/unambiguous": "error", "complexity": [ 0, 4 ],
"new-cap": [ "curly": 2,
"error", "dot-notation": [ "error", { "allowKeywords": true } ],
"empty-returns/main": "error",
"eol-last": [ "error", "always" ],
"eqeqeq": 2,
"func-style": [ "off", "expression" ],
"guard-for-in": 2,
"keyword-spacing": [ "error",
{ {
"before": true,
"after": true,
"overrides": {
"return": { "after": true },
"throw": { "after": true },
"case": { "after": true }
}
}
],
"max-depth": [ 0, 4 ],
"max-len": [ "error", 100, 2,
{
"ignoreUrls": true,
"ignoreComments": false,
"ignoreRegExpLiterals": true,
"ignoreStrings": true,
"ignoreTemplateLiterals": true
}
],
"max-params": [ 0, 3 ],
"max-statements": [ 0, 10 ],
"new-cap": [ "error",
{
"newIsCap": true,
"capIsNew": false "capIsNew": false
} }
], ],
"no-alert": "error", "new-parens": 2,
"newline-per-chained-call": 0,
"no-alert": 2,
"no-array-constructor": "error", "no-array-constructor": "error",
"no-bitwise": "error", "no-bitwise": 2,
"no-caller": "error", "no-caller": 2,
"no-catch-shadow": "error", "no-case-declarations": "error",
"no-constant-condition": ["error", {"checkLoops": false}], "no-catch-shadow": 2,
"no-div-regex": "error", "no-console": 0,
"no-const-assign": "error",
"no-control-regex": 2,
"no-debugger": 2,
"no-delete-var": 2,
"no-div-regex": 2,
"no-dupe-class-members": "error",
"no-dupe-keys": 2,
"no-duplicate-imports": "error", "no-duplicate-imports": "error",
"no-else-return": "error", "no-else-return": 2,
"no-eq-null": "error", "no-empty": 2,
"no-eval": "error", "no-empty-character-class": 2,
"no-implied-eval": "error", "no-eq-null": 2,
"no-inner-declarations": "off", "no-eval": 2,
"no-ex-assign": 2,
"no-extra-parens": [ "error", "functions" ],
"no-extra-semi": 2,
"no-fallthrough": 2,
"no-floating-decimal": 2,
"no-func-assign": 2,
"no-implied-eval": 2,
"no-iterator": "error", "no-iterator": "error",
"no-label-var": "error", "no-label-var": 2,
"no-labels": "error", "no-labels": 2,
"no-loop-func": "error", "no-loop-func": 2,
"no-multi-str": "error", "no-mixed-requires": [ 0, false ],
"no-native-reassign": "error", "no-multi-str": 2,
"no-native-reassign": 2,
"no-nested-ternary": 0,
"no-new-func": "error", "no-new-func": "error",
"no-new-object": "error", "no-new-object": 2,
"no-new-wrappers": "error", "no-new-wrappers": 2,
"no-octal-escape": "error", "no-obj-calls": 2,
"no-plusplus": "error", "no-octal": 2,
"no-proto": "error", "no-octal-escape": 2,
"no-return-assign": "error", "no-param-reassign": 0,
"no-script-url": "error", "no-plusplus": 2,
"no-self-compare": "error", "no-proto": 2,
"no-sync": "error", "no-redeclare": 2,
"no-undef-init": "error", "no-regex-spaces": 2,
"no-unneeded-ternary": ["error", {"defaultAssignment": false}], "no-restricted-syntax": 0,
"no-unused-expressions": "error", "no-return-assign": 2,
"no-unused-vars": [ "no-script-url": 2,
"error", "no-self-compare": 2,
"no-shadow": 0,
"no-sync": 2,
"no-ternary": 0,
"no-undef": "error",
"no-undef-init": 2,
"no-underscore-dangle": 0,
"no-unneeded-ternary": [ "error", { "defaultAssignment": false } ],
"no-unreachable": 2,
"no-unused-expressions": 2,
"no-unused-vars": [ "error",
{ {
"vars": "local", "vars": "local",
"args": "after-used",
"varsIgnorePattern": "print_elapsed_time|check_duplicate_ids" "varsIgnorePattern": "print_elapsed_time|check_duplicate_ids"
} }
], ],
"no-use-before-define": "error", "no-use-before-define": 2,
"no-useless-constructor": "error", "no-useless-constructor": "error",
"no-var": "error", // The Zulip codebase complies partially with the "no-useless-escape"
"object-shorthand": "error", // rule; only regex expressions haven't been updated yet.
"one-var": ["error", "never"], // Updated regex expressions are currently being tested in casper
"prefer-arrow-callback": "error", // files and will decide about a potential future enforcement of this rule.
"prefer-const": [ "no-useless-escape": 0,
"error", "no-whitespace-before-property": 0,
"no-with": 2,
"one-var": [ "error", "never" ],
"padded-blocks": 0,
"prefer-const": [ "error",
{ {
"destructuring": "any",
"ignoreReadBeforeAssign": true "ignoreReadBeforeAssign": true
} }
], ],
"radix": "error", "quote-props": [ "error", "as-needed",
"spaced-comment": "off", {
"strict": "error", "keywords": false,
"valid-typeof": ["error", {"requireStringLiterals": true}], "unnecessary": true,
"yoda": "error" "numbers": false
},
"overrides": [
{
"files": ["frontend_tests/**", "static/js/**"],
"globals": {
"$": false,
"FetchStatus": false,
"Filter": false,
"LightboxCanvas": false,
"ListCursor": false,
"MessageListData": false,
"MessageListView": false,
"UserSearch": false,
"activity": false,
"admin": false,
"alert_words": false,
"alert_words_ui": false,
"attachments_ui": false,
"avatar": false,
"billing": false,
"blueslip": false,
"bot_data": false,
"bridge": false,
"buddy_data": false,
"buddy_list": false,
"channel": false,
"click_handlers": false,
"color_data": false,
"colorspace": false,
"common": false,
"components": false,
"compose": false,
"compose_actions": false,
"compose_fade": false,
"compose_pm_pill": false,
"compose_state": false,
"compose_ui": false,
"composebox_typeahead": false,
"condense": false,
"confirm_dialog": false,
"copy_and_paste": false,
"csrf_token": false,
"current_msg_list": true,
"drafts": false,
"dropdown_list_widget": false,
"echo": false,
"emoji_picker": false,
"favicon": false,
"feature_flags": false,
"feedback_widget": false,
"flatpickr": false,
"floating_recipient_bar": false,
"gear_menu": false,
"hash_util": false,
"hashchange": false,
"helpers": false,
"history": false,
"home_msg_list": false,
"hotspots": false,
"i18n": false,
"info_overlay": false,
"input_pill": false,
"invite": false,
"jQuery": false,
"keydown_util": false,
"lightbox": false,
"list_render": false,
"list_util": false,
"loading": false,
"localStorage": false,
"local_message": false,
"localstorage": false,
"location": false,
"markdown": false,
"message_edit": false,
"message_edit_history": false,
"message_events": false,
"message_fetch": false,
"message_flags": false,
"message_list": false,
"message_live_update": false,
"message_scroll": false,
"message_store": false,
"message_util": false,
"message_viewport": false,
"muting": false,
"muting_ui": false,
"narrow": false,
"narrow_state": false,
"navigate": false,
"night_mode": false,
"notifications": false,
"overlays": false,
"padded_widget": false,
"page_params": false,
"panels": false,
"pill_typeahead": false,
"people": false,
"pm_conversations": false,
"pm_list": false,
"pm_list_dom": false,
"pointer": false,
"popovers": false,
"presence": false,
"reactions": false,
"realm_icon": false,
"realm_logo": false,
"realm_night_logo": false,
"recent_senders": false,
"recent_topics": false,
"reload": false,
"reload_state": false,
"reminder": false,
"resize": false,
"rows": false,
"rtl": false,
"run_test": false,
"schema": false,
"scroll_bar": false,
"scroll_util": false,
"search": false,
"search_pill": false,
"search_pill_widget": false,
"search_suggestion": false,
"search_util": false,
"sent_messages": false,
"server_events": false,
"server_events_dispatch": false,
"settings": false,
"settings_account": false,
"settings_bots": false,
"settings_display": false,
"settings_emoji": false,
"settings_exports": false,
"settings_linkifiers": false,
"settings_invites": false,
"settings_muting": false,
"settings_notifications": false,
"settings_org": false,
"settings_panel_menu": false,
"settings_profile_fields": false,
"settings_sections": false,
"settings_streams": false,
"settings_toggle": false,
"settings_ui": false,
"settings_user_groups": false,
"settings_users": false,
"spoilers": false,
"starred_messages": false,
"stream_color": false,
"stream_create": false,
"stream_data": false,
"stream_edit": false,
"stream_events": false,
"stream_topic_history": false,
"stream_list": false,
"stream_muting": false,
"stream_popover": false,
"stream_sort": false,
"stream_ui_updates": false,
"StripeCheckout": false,
"submessage": false,
"subs": false,
"message_view_header": false,
"templates": false,
"tictactoe_widget": false,
"timerender": false,
"todo_widget": false,
"top_left_corner": false,
"topic_generator": false,
"topic_list": false,
"topic_zoom": false,
"transmit": false,
"tutorial": false,
"typeahead_helper": false,
"typing": false,
"typing_data": false,
"typing_events": false,
"ui": false,
"ui_init": false,
"ui_report": false,
"ui_util": false,
"unread": false,
"unread_ops": false,
"unread_ui": false,
"upgrade": false,
"upload": false,
"upload_widget": false,
"user_events": false,
"user_groups": false,
"user_pill": false,
"user_status": false,
"user_status_ui": false,
"poll_widget": false,
"vdom": false,
"widgetize": false,
"zcommand": false,
"zform": false,
"zxcvbn": false
} }
}
],
"quotes": [ 0, "single" ],
"radix": 2,
"semi": 2,
"space-before-blocks": 2,
"space-before-function-paren": [ "error",
{
"anonymous": "always",
"named": "never",
"asyncArrow": "always"
},
{
"files": ["**/*.ts"],
"extends": [
"plugin:@typescript-eslint/recommended",
"plugin:import/typescript",
"prettier/@typescript-eslint"
],
"parserOptions": {
"project": "tsconfig.json"
},
"rules": {
// Disable base rule to avoid conflict
"empty-returns/main": "off",
"no-unused-vars": "off",
"no-useless-constructor": "off",
"@typescript-eslint/array-type": "error",
"@typescript-eslint/await-thenable": "error",
"@typescript-eslint/consistent-type-assertions": "error",
"@typescript-eslint/explicit-function-return-type": [
"error",
{"allowExpressions": true}
],
"@typescript-eslint/member-ordering": "error",
"@typescript-eslint/no-explicit-any": "off",
"@typescript-eslint/no-extraneous-class": "error",
"@typescript-eslint/no-non-null-assertion": "off",
"@typescript-eslint/no-parameter-properties": "error",
"@typescript-eslint/no-unnecessary-qualifier": "error",
"@typescript-eslint/no-unnecessary-type-assertion": "error",
"@typescript-eslint/no-unused-vars": ["error", {"varsIgnorePattern": "^_"}],
"@typescript-eslint/no-use-before-define": "error",
"@typescript-eslint/no-useless-constructor": "error",
"@typescript-eslint/prefer-includes": "error",
"@typescript-eslint/prefer-regexp-exec": "error",
"@typescript-eslint/prefer-string-starts-ends-with": "error",
"@typescript-eslint/promise-function-async": "error",
"@typescript-eslint/unified-signatures": "error"
} }
}
],
"space-in-parens": 2,
"space-infix-ops": 0,
"spaced-comment": 0,
"strict": 0,
"template-curly-spacing": "error",
"unnecessary-strict": 0,
"use-isnan": 2,
"valid-typeof": [ "error", { "requireStringLiterals": true } ],
"wrap-iife": [ "error", "outside", { "functionPrototypeMethods": false } ],
"wrap-regex": 0,
"yoda": 2
}
},
{
"files": ["**/*.d.ts"],
"rules": {
"import/unambiguous": "off"
}
},
{
"files": ["frontend_tests/**"],
"globals": {
"assert": false,
"casper": false,
"document": false,
"set_global": false,
"window": false,
"with_field": false,
"zrequire": false
},
"rules": {
"no-sync": "off"
}
},
{
"files": [
"frontend_tests/casper_lib/**",
"frontend_tests/casper_tests/**",
"tools/debug-require.js"
],
"env": {
"browser": true,
"es2020": false
},
"rules": {
// Dont require ES features that PhantomJS doesnt support
"no-var": "off",
"object-shorthand": "off",
"prefer-arrow-callback": "off"
}
},
{
"files": ["static/**"],
"env": {
"browser": true,
"commonjs": true,
"node": false
},
"rules": {
"no-console": "error"
}
},
{
"files": ["static/shared/**"],
"env": {
"browser": false,
"commonjs": false,
"shared-node-browser": true
},
"rules": {
"import/no-restricted-paths": [
"error",
{
"zones": [
{
"target": "./static/shared",
"from": ".",
"except": ["./node_modules", "./static/shared"]
}
]
}
]
}
}
]
}

2
.gitattributes vendored

@@ -10,4 +10,4 @@
*.png binary
*.otf binary
*.tif binary
*.ogg binary
yarn.lock binary

.github/workflows/cancel-previous-runs.yml

@@ -1,41 +0,0 @@
name: Cancel Previous Runs
on: [push, pull_request]
defaults:
run:
shell: bash
jobs:
cancel:
name: Cancel Previous Runs
runs-on: ubuntu-latest
timeout-minutes: 3
# Don't run this job for zulip/zulip pushes since we
# want to run those jobs.
if: ${{ github.event_name != 'push' || github.event.repository.full_name != 'zulip/zulip' }}
steps:
# We get workflow IDs from GitHub API so we don't have to maintain
# a hard-coded list of IDs which need to be updated when a workflow
# is added or removed. And, workflow IDs are different for other forks
# so this is required.
- name: Get workflow IDs.
id: workflow_ids
env:
# This is in <owner>/<repo> format e.g. zulip/zulip
REPOSITORY: ${{ github.repository }}
run: |
workflow_api_url=https://api.github.com/repos/$REPOSITORY/actions/workflows
curl $workflow_api_url -o workflows.json
script="const {workflows} = require('./workflows'); \
const ids = workflows.map(workflow => workflow.id); \
console.log(ids.join(','));"
ids=$(node -e "$script")
echo "::set-output name=ids::$ids"
- uses: styfle/cancel-workflow-action@0.4.1
with:
workflow_id: ${{ steps.workflow_ids.outputs.ids }}
access_token: ${{ github.token }}

.github/workflows/codeql-analysis.yml

@@ -1,30 +0,0 @@
name: "Code Scanning"
on: [push, pull_request]
jobs:
CodeQL:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v2
with:
# We must fetch at least the immediate parents so that if this is
# a pull request then we can checkout the head.
fetch-depth: 2
# If this run was triggered by a pull request event, then checkout
# the head of the pull request instead of the merge commit.
- run: git checkout HEAD^2
if: ${{ github.event_name == 'pull_request' }}
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
# Override language selection by uncommenting this and choosing your languages
# with:
# languages: go, javascript, csharp, python, cpp, java
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1

.github/workflows/production-suite.yml

@@ -1,182 +0,0 @@
name: Zulip Production Suite
on:
push:
paths:
- "**/migrations/**"
- puppet/**
- requirements/**
- scripts/**
- static/**
- tools/**
- zproject/**
- yarn.lock
- .github/workflows/production-suite.yml
pull_request:
paths:
- "**/migrations/**"
- puppet/**
- requirements/**
- scripts/**
- static/**
- tools/**
- zproject/**
- yarn.lock
- .github/workflows/production-suite.yml
defaults:
run:
shell: bash
jobs:
production_build:
name: Bionic Production Build
runs-on: ubuntu-latest
# This docker image was created by a generated Dockerfile at:
# tools/ci/images/bionic/Dockerfile
# Bionic ships with Python 3.6.
container: mepriyank/actions:bionic
steps:
- name: Add required permissions
run: |
# The checkout actions doesn't clone to ~/zulip or allow
# us to use the path option to clone outside the current
# /__w/zulip/zulip directory. Since this directory is owned
# by root we need to change it's ownership to allow the
# github user to clone the code here.
# Note: /__w/ is a docker volume mounted to $GITHUB_WORKSPACE
# which is /home/runner/work/.
sudo chown -R github .
# This is the GitHub Actions specific cache directory the
# the current github user must be able to access for the
# cache action to work. It is owned by root currently.
sudo chmod -R 0777 /__w/_temp/
- uses: actions/checkout@v2
- name: Create cache directories
run: |
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R github "${dirs[@]}"
- name: Restore node_modules cache
uses: actions/cache@v2
with:
path: /srv/zulip-npm-cache
key: v1-yarn-deps-${{ github.job }}-${{ hashFiles('package.json') }}-${{ hashFiles('yarn.lock') }}
restore-keys: v1-yarn-deps-${{ github.job }}
- name: Restore python cache
uses: actions/cache@v2
with:
path: /srv/zulip-venv-cache
key: v1-venv-${{ github.job }}-${{ hashFiles('requirements/thumbor-dev.txt') }}-${{ hashFiles('requirements/dev.txt') }}
restore-keys: v1-venv-${{ github.job }}
- name: Restore emoji cache
uses: actions/cache@v2
with:
path: /srv/zulip-emoji-cache
key: v1-emoji-${{ github.job }}-${{ hashFiles('tools/setup/emoji/emoji_map.json') }}-${{ hashFiles('tools/setup/emoji/build_emoji') }}-${{ hashFiles('tools/setup/emoji/emoji_setup_utils.py') }}-${{ hashFiles('tools/setup/emoji/emoji_names.py') }}-${{ hashFiles('package.json') }}
restore-keys: v1-emoji-${{ github.job }}
- name: Do Bionic hack
run: |
# Temporary hack till `sudo service redis-server start` gets fixes in Bionic. See
# https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
sudo sed -i '/^bind/s/bind.*/bind 0.0.0.0/' /etc/redis/redis.conf
- name: Build production tarball
run: mispipe "./tools/ci/production-build 2>&1" ts
- name: Upload production build artifacts for install jobs
uses: actions/upload-artifact@v2
with:
name: production-tarball
path: /tmp/production-build
production_install:
strategy:
fail-fast: false
matrix:
include:
- docker_image: mepriyank/actions:bionic
name: Bionic Production Install
is_bionic: true
os: bionic
- docker_image: mepriyank/actions:focal
name: Focal Production Install
is_focal: true
os: focal
name: ${{ matrix.name }}
container: ${{ matrix.docker_image }}
runs-on: ubuntu-latest
needs: production_build
steps:
- name: Download built production tarball
uses: actions/download-artifact@v2
with:
name: production-tarball
path: /tmp
- name: Add required permissions and setup
run: |
# This is the GitHub Actions specific cache directory the
# the current github user must be able to access for the
# cache action to work. It is owned by root currently.
sudo chmod -R 0777 /__w/_temp/
# Create the zulip directory that the tools/ci/ scripts needs
mkdir -p /home/github/zulip
# Since actions/download-artifact@v2 loses all the permissions
# of the tarball uploaded by the upload artifact fix those.
chmod +x /tmp/production-extract-tarball
chmod +x /tmp/production-upgrade-pg
chmod +x /tmp/production-install
chmod +x /tmp/production-verify
- name: Create cache directories
run: |
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R github "${dirs[@]}"
- name: Restore node_modules cache
uses: actions/cache@v2
with:
path: /srv/zulip-npm-cache
key: v1-yarn-deps-${{ matrix.os }}-${{ hashFiles('/tmp/package.json') }}-${{ hashFiles('/tmp/yarn.lock') }}
restore-keys: v1-yarn-deps-${{ matrix.os }}
- name: Do Bionic hack
if: ${{ matrix.is_bionic }}
run: |
# Temporary hack till `sudo service redis-server start` gets fixes in Bionic. See
# https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
sudo sed -i '/^bind/s/bind.*/bind 0.0.0.0/' /etc/redis/redis.conf
- name: Production extract tarball
run: mispipe "/tmp/production-extract-tarball 2>&1" ts
- name: Install production
run: |
sudo service rabbitmq-server restart
sudo mispipe "/tmp/production-install 2>&1" ts
- name: Verify install
run: sudo mispipe "/tmp/production-verify 2>&1" ts
- name: Upgrade postgresql
if: ${{ matrix.is_bionic }}
run: sudo mispipe "/tmp/production-upgrade-pg 2>&1" ts
- name: Verify install after upgrading postgresql
if: ${{ matrix.is_bionic }}
run: sudo mispipe "/tmp/production-verify 2>&1" ts


@@ -1,158 +0,0 @@
name: Zulip CI
on: [push, pull_request]
defaults:
run:
shell: bash
jobs:
tests:
strategy:
fail-fast: false
matrix:
include:
# This docker image was created by a generated Dockerfile at:
# tools/ci/images/bionic/Dockerfile
# Bionic ships with Python 3.6.
- docker_image: mepriyank/actions:bionic
name: Ubuntu 18.04 Bionic (Python 3.6, backend + frontend)
os: bionic
is_bionic: true
include_frontend_tests: true
# This docker image was created by a generated Dockerfile at:
# tools/ci/images/focal/Dockerfile
# Focal ships with Python 3.8.2.
- docker_image: mepriyank/actions:focal
name: Ubuntu 20.04 Focal (Python 3.8, backend)
os: focal
is_focal: true
include_frontend_tests: false
runs-on: ubuntu-latest
name: ${{ matrix.name }}
container: ${{ matrix.docker_image }}
env:
# GitHub Actions sets HOME to /github/home which causes
# problem later in provison and frontend test that runs
# tools/setup/postgres-init-dev-db because of the .pgpass
# location. Postgresql (psql) expects .pgpass to be at
# /home/github/.pgpass and setting home to `/home/github/`
# ensures it written there because we write it to ~/.pgpass.
HOME: /home/github/
steps:
- name: Add required permissions
run: |
# The checkout actions doesn't clone to ~/zulip or allow
# us to use the path option to clone outside the current
# /__w/zulip/zulip directory. Since this directory is owned
# by root we need to change it's ownership to allow the
# github user to clone the code here.
# Note: /__w/ is a docker volume mounted to $GITHUB_WORKSPACE
# which is /home/runner/work/.
sudo chown -R github .
# This is the GitHub Actions specific cache directory the
# the current github user must be able to access for the
# cache action to work. It is owned by root currently.
sudo chmod -R 0777 /__w/_temp/
- uses: actions/checkout@v2
- name: Create cache directories
run: |
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R github "${dirs[@]}"
- name: Restore node_modules cache
uses: actions/cache@v2
with:
path: /srv/zulip-npm-cache
key: v1-yarn-deps-${{ matrix.os }}-${{ hashFiles('package.json') }}-${{ hashFiles('yarn.lock') }}
restore-keys: v1-yarn-deps-${{ matrix.os }}
- name: Restore python cache
uses: actions/cache@v2
with:
path: /srv/zulip-venv-cache
key: v1-venv-${{ matrix.os }}-${{ hashFiles('requirements/thumbor-dev.txt') }}-${{ hashFiles('requirements/dev.txt') }}
restore-keys: v1-venv-${{ matrix.os }}
- name: Restore emoji cache
uses: actions/cache@v2
with:
path: /srv/zulip-emoji-cache
key: v1-emoji-${{ matrix.os }}-${{ hashFiles('tools/setup/emoji/emoji_map.json') }}-${{ hashFiles('tools/setup/emoji/build_emoji') }}-${{ hashFiles('tools/setup/emoji/emoji_setup_utils.py') }}-${{ hashFiles('tools/setup/emoji/emoji_names.py') }}-${{ hashFiles('package.json') }}
restore-keys: v1-emoji-${{ matrix.os }}
- name: Do Bionic hack
if: ${{ matrix.is_bionic }}
run: |
# Temporary hack till `sudo service redis-server start` gets fixes in Bionic. See
# https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
sudo sed -i '/^bind/s/bind.*/bind 0.0.0.0/' /etc/redis/redis.conf
- name: Install dependencies
run: |
# This is the main setup job for the test suite
mispipe "tools/ci/setup-backend --skip-dev-db-build" ts
# Cleaning caches is mostly unnecessary in GitHub Actions, because
# most builds don't get to write to the cache.
# mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0 2>&1" ts
- name: Run backend tests
run: |
. /srv/zulip-py3-venv/bin/activate && \
mispipe "./tools/ci/backend 2>&1" ts
- name: Run frontend tests
if: ${{ matrix.include_frontend_tests }}
run: |
. /srv/zulip-py3-venv/bin/activate
mispipe "./tools/ci/frontend 2>&1" ts
- name: Test locked requirements
if: ${{ matrix.is_bionic }}
run: |
. /srv/zulip-py3-venv/bin/activate && \
mispipe "./tools/test-locked-requirements 2>&1" ts
- name: Upload coverage reports
# Only upload coverage when both frontend and backend
# tests are ran.
if: ${{ matrix.include_frontend_tests }}
run: |
# Codcov requires `.coverage` file to be stored in the
# current working directory.
mv ./var/.coverage ./.coverage
. /srv/zulip-py3-venv/bin/activate || true
# TODO: Check that the next release of codecov doesn't
# throw find error.
# codecov==2.0.16 introduced a bug which uses "find"
# for locating files which is buggy on some platforms.
# It was fixed via https://github.com/codecov/codecov-python/pull/217
# and should get automatically fixed here once it's released.
# We cannot pin the version here because we need the latest version for uploading files.
# see https://community.codecov.io/t/http-400-while-uploading-to-s3-with-python-codecov-from-travis/1428/7
pip install codecov && codecov || echo "Error in uploading coverage reports to codecov.io."
- name: Store puppeteer artifacts
if: ${{ matrix.include_frontend_tests }}
uses: actions/upload-artifact@v2
with:
name: puppeteer
path: ./var/puppeteer
- name: Check development database build
if: ${{ matrix.is_focal }}
run: mispipe "tools/ci/setup-backend" ts
# TODO: We need to port the notify_failure step from CircleCI
# config, however, it might be the case that GitHub Notifications
# make this unnesscary. More details on settings to configure it:
# https://help.github.com/en/github/managing-subscriptions-and-notifications-on-github/configuring-notifications#github-actions-notification-options

14
.gitignore vendored

@@ -29,16 +29,8 @@ package-lock.json
/.vagrant
/var
/.dmypy.json
# Dockerfiles generated for CircleCI
/tools/ci/images
/tools/circleci/images
# Generated i18n data
/locale/en
/locale/language_options.json
/locale/language_name_map.json
/locale/*/mobile.json
# Static build
*.mo
@@ -48,7 +40,6 @@ npm-debug.log
/staticfiles.json
/webpack-stats-production.json
/yarn-error.log
zulip-git-version
# Test / analysis tools
.coverage
@@ -76,9 +67,6 @@ zulip.kdev4
.cache/
.eslintcache
# Core dump files
core
## Miscellaneous
# (Ideally this section is empty.)
zthumbor/thumbor_local_settings.py


@@ -1,7 +1,10 @@
 [settings]
-src_paths = ., tools, tools/setup/emoji
-multi_line_output = 3
-known_third_party = zulip
-include_trailing_comma = True
-use_parentheses = True
-line_length = 100
+line_length = 79
+multi_line_output = 2
+balanced_wrapping = true
+known_third_party = django, ujson, sqlalchemy
+known_first_party = zerver, zproject, version, confirmation, zilencer, analytics, frontend_tests, scripts, corporate
+sections = FUTURE, STDLIB, THIRDPARTY, FIRSTPARTY, LOCALFOLDER
+lines_after_imports = 1
+# See the comment related to ioloop_logging for why this is skipped.
+skip = zerver/management/commands/runtornado.py
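For readers unfamiliar with isort's numbered wrap modes, the two sides of this change produce visibly different import layouts. A minimal illustration (the imported names are hypothetical), assuming isort's documented behavior for multi_line_output modes 3 and 2:

# multi_line_output = 3 with use_parentheses and include_trailing_comma
# ("vertical hanging indent" -- the newer side of this diff):
from zerver.models import (
    Message,
    Realm,
    UserProfile,
)

# multi_line_output = 2 ("hanging indent" -- the older side), which wraps
# with backslash continuations instead of parentheses:
from zerver.models import Message, Realm, \
    UserProfile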


@@ -1,39 +0,0 @@
Alex Vandiver <alexmv@zulip.com> <alex@chmrr.net>
Alex Vandiver <alexmv@zulip.com> <github@chmrr.net>
Allen Rabinovich <allenrabinovich@yahoo.com> <allenr@humbughq.com>
Allen Rabinovich <allenrabinovich@yahoo.com> <allenr@zulip.com>
Aman Agrawal <amanagr@zulip.com> <f2016561@pilani.bits-pilani.ac.in>
Anders Kaseorg <anders@zulip.com> <anders@zulipchat.com>
Anders Kaseorg <anders@zulip.com> <andersk@mit.edu>
Brock Whittaker <brock@zulipchat.com> <bjwhitta@asu.edu>
Brock Whittaker <brock@zulipchat.com> <brockwhittaker@Brocks-MacBook.local>
Brock Whittaker <brock@zulipchat.com> <brock@zulipchat.org>
Chris Bobbe <cbobbe@zulip.com> <cbobbe@zulipchat.com>
Chris Bobbe <cbobbe@zulip.com> <csbobbe@gmail.com>
Greg Price <greg@zulip.com> <gnprice@gmail.com>
Greg Price <greg@zulip.com> <greg@zulipchat.com>
Greg Price <greg@zulip.com> <price@mit.edu>
Jeff Arnold <jbarnold@gmail.com> <jbarnold@humbughq.com>
Jeff Arnold <jbarnold@gmail.com> <jbarnold@zulip.com>
Jessica McKellar <jesstess@mit.edu> <jesstess@humbughq.com>
Jessica McKellar <jesstess@mit.edu> <jesstess@zulip.com>
Kevin Mehall <km@kevinmehall.net> <kevin@humbughq.com>
Kevin Mehall <km@kevinmehall.net> <kevin@zulip.com>
Ray Kraesig <rkraesig@zulip.com> <rkraesig@zulipchat.com>
Rishi Gupta <rishig@zulipchat.com> <rishig+git@mit.edu>
Rishi Gupta <rishig@zulipchat.com> <rishig@kandralabs.com>
Rishi Gupta <rishig@zulipchat.com> <rishig@users.noreply.github.com>
Reid Barton <rwbarton@gmail.com> <rwbarton@humbughq.com>
Scott Feeney <scott@oceanbase.org> <scott@humbughq.com>
Scott Feeney <scott@oceanbase.org> <scott@zulip.com>
Steve Howell <showell@zulip.com> <showell30@yahoo.com>
Steve Howell <showell@zulip.com> <showell@yahoo.com>
Steve Howell <showell@zulip.com> <showell@zulipchat.com>
Steve Howell <showell@zulip.com> <steve@humbughq.com>
Steve Howell <showell@zulip.com> <steve@zulip.com>
Tim Abbott <tabbott@zulip.com> <tabbott@dropbox.com>
Tim Abbott <tabbott@zulip.com> <tabbott@humbughq.com>
Tim Abbott <tabbott@zulip.com> <tabbott@mit.edu>
Tim Abbott <tabbott@zulip.com> <tabbott@zulipchat.com>
Vishnu KS <vishnu@zulip.com> <hackerkid@vishnuks.com>
Vishnu KS <vishnu@zulip.com> <yo@vishnuks.com>


@@ -1,6 +0,0 @@
/corporate/tests/stripe_fixtures
/locale
/static/third
/tools/setup/emoji/emoji_map.json
/zerver/tests/fixtures
/zerver/webhooks/*/fixtures


@@ -1,14 +0,0 @@
{
"source_directories": ["."],
"taint_models_path": [
"stubs/taint",
"zulip-py3-venv/lib/pyre_check/taint/"
],
"search_path": [
"stubs/",
"zulip-py3-venv/lib/pyre_check/stubs/"
],
"exclude": [
"/srv/zulip/zulip-py3-venv/.*"
]
}


@@ -1 +0,0 @@
sonar.inclusions=**/*.py,**/*.html

67 .travis.yml Normal file

@@ -0,0 +1,67 @@
# See https://zulip.readthedocs.io/en/latest/testing/travis.html for
# high-level documentation on our Travis CI setup.
dist: trusty
group: deprecated-2017Q4
install:
# Disable sometimes-broken sources.list in Travis base images
- sudo rm -vf /etc/apt/sources.list.d/*
- sudo apt-get update
# Disable Travis CI's built-in NVM installation
- mispipe "mv ~/.nvm ~/.travis-nvm-disabled" ts
# Install codecov, the library for the code coverage reporting tool we use
# With a retry to minimize impact of transient networking errors.
- mispipe "pip install codecov" ts || mispipe "pip install codecov" ts
# This is the main setup job for the test suite
- mispipe "tools/travis/setup-$TEST_SUITE" ts
# Clean any caches that are not in use to avoid our cache
# becoming huge.
- mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0" ts
script:
# We unset GEM_PATH here as a hack to work around Travis CI having
# broken running their system puppet with Ruby. See
# https://travis-ci.org/zulip/zulip/jobs/240120991 for an example traceback.
- unset GEM_PATH
- mispipe "./tools/travis/$TEST_SUITE" ts
cache:
yarn: true
apt: false
directories:
- $HOME/zulip-venv-cache
- $HOME/zulip-npm-cache
- $HOME/zulip-emoji-cache
- $HOME/node
- $HOME/misc
env:
global:
- BOTO_CONFIG=/tmp/nowhere
language: python
# Our test suites generally run on Python 3.4, the version in
# Ubuntu 14.04 trusty, which is the oldest OS release we support.
matrix:
include:
# Travis will actually run the jobs in the order they're listed here;
# that doesn't seem to be documented, but it's what we see empirically.
# We only get 4 jobs running at a time, so we try to make the first few
# the most likely to break.
- python: "3.4"
env: TEST_SUITE=production
# Other suites moved to CircleCI -- see .circleci/.
sudo: required
addons:
artifacts:
paths:
# Casper debugging data (screenshots, etc.) is super useful for
# debugging test flakes.
- $(ls var/casper/* | tr "\n" ":")
- $(ls /tmp/zulip-test-event-log/* | tr "\n" ":")
postgresql: "9.3"
apt:
packages:
- moreutils
after_success:
- codecov


@@ -3,31 +3,31 @@ host = https://www.transifex.com
 lang_map = zh-Hans: zh_Hans, zh-Hant: zh_Hant
 [zulip.djangopo]
-file_filter = locale/<lang>/LC_MESSAGES/django.po
-source_file = locale/en/LC_MESSAGES/django.po
+source_file = static/locale/en/LC_MESSAGES/django.po
 source_lang = en
 type = PO
+file_filter = static/locale/<lang>/LC_MESSAGES/django.po
 [zulip.translationsjson]
-file_filter = locale/<lang>/translations.json
-source_file = locale/en/translations.json
+source_file = static/locale/en/translations.json
 source_lang = en
 type = KEYVALUEJSON
+file_filter = static/locale/<lang>/translations.json
-[zulip.mobile]
-file_filter = locale/<lang>/mobile.json
-source_file = locale/en/mobile.json
+[zulip.messages]
+source_file = static/locale/en/mobile.json
 source_lang = en
 type = KEYVALUEJSON
+file_filter = static/locale/<lang>/mobile.json
 [zulip-test.djangopo]
-file_filter = locale/<lang>/LC_MESSAGES/django.po
-source_file = locale/en/LC_MESSAGES/django.po
+source_file = static/locale/en/LC_MESSAGES/django.po
 source_lang = en
 type = PO
+file_filter = static/locale/<lang>/LC_MESSAGES/django.po
 [zulip-test.translationsjson]
-file_filter = locale/<lang>/translations.json
-source_file = locale/en/translations.json
+source_file = static/locale/en/translations.json
 source_lang = en
 type = KEYVALUEJSON
+file_filter = static/locale/<lang>/translations.json


@@ -1 +0,0 @@
ignore-scripts true


@@ -14,7 +14,7 @@ This isn't an exhaustive list of things that you can't do. Rather, take it
 in the spirit in which it's intended --- a guide to make it easier to enrich
 all of us and the technical communities in which we participate.
-## Expected behavior
+## Expected Behavior
 The following behaviors are expected and requested of all community members:
@@ -29,7 +29,7 @@ The following behaviors are expected and requested of all community members:
 * Community event venues may be shared with members of the public; be
 respectful to all patrons of these locations.
-## Unacceptable behavior
+## Unacceptable Behavior
 The following behaviors are considered harassment and are unacceptable
 within the Zulip community:
@@ -53,7 +53,7 @@ within the Zulip community:
 presentations.
 * Advocating for, or encouraging, any of the behaviors above.
-## Reporting and enforcement
+## Reporting and Enforcement
 Harassment and other code of conduct violations reduce the value of the
 community for everyone. If someone makes you or anyone else feel unsafe or
@@ -78,7 +78,7 @@ something you can do while a violation is happening, do it. A lot of the
 harms of harassment and other violations can be mitigated by the victim
 knowing that the other people present are on their side.
-All reports will be kept confidential. In some cases, we may determine that a
+All reports will be kept confidential. In some cases we may determine that a
 public statement will need to be made. In such cases, the identities of all
 victims and reporters will remain confidential unless those individuals
 instruct us otherwise.
@@ -95,11 +95,11 @@ behavior occurring outside the scope of community activities when such
 behavior has the potential to adversely affect the safety and well-being of
 community members.
-## License and attribution
+## License and Attribution
 This Code of Conduct is adapted from the
 [Citizen Code of Conduct](http://citizencodeofconduct.org/) and the
 [Django Code of Conduct](https://www.djangoproject.com/conduct/), and is
 under a
-[Creative Commons BY-SA](https://creativecommons.org/licenses/by-sa/4.0/)
+[Creative Commons BY-SA](http://creativecommons.org/licenses/by-sa/4.0/)
 license.


@@ -13,8 +13,7 @@ user, or anything else. Make sure to read the
 before posting. The Zulip community is also governed by a
 [code of conduct](https://zulip.readthedocs.io/en/latest/code-of-conduct.html).
-You can subscribe to zulip-devel-announce@googlegroups.com or our
-[Twitter](https://twitter.com/zulip) account for a lower traffic (~1
+You can subscribe to zulip-devel@googlegroups.com for a lower traffic (~1
 email/month) way to hear about things like mentorship opportunities with Google
 Code-in, in-person sprints at conferences, and other opportunities to
 contribute.
@@ -29,11 +28,11 @@ needs doing:
 [backend](https://github.com/zulip/zulip), web
 [frontend](https://github.com/zulip/zulip), React Native
 [mobile app](https://github.com/zulip/zulip-mobile), or Electron
-[desktop app](https://github.com/zulip/zulip-desktop).
+[desktop app](https://github.com/zulip/zulip-electron).
 * Building out our
 [Python API and bots](https://github.com/zulip/python-zulip-api) framework.
-* [Writing an integration](https://zulip.com/api/integrations-overview).
-* Improving our [user](https://zulip.com/help/) or
+* [Writing an integration](https://zulipchat.com/api/integration-guide).
+* Improving our [user](https://zulipchat.com/help/) or
 [developer](https://zulip.readthedocs.io/en/latest/) documentation.
 * [Reviewing code](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html)
 and manually testing pull requests.
@@ -47,7 +46,7 @@ don't require touching the codebase at all. We list a few of them below:
 * [Translating](https://zulip.readthedocs.io/en/latest/translating/translating.html)
 Zulip.
 * [Outreach](#zulip-outreach): Star us on GitHub, upvote us
-on product comparison sites, or write for [the Zulip blog](https://blog.zulip.org/).
+on product comparison sites, or write for [the Zulip blog](http://blog.zulip.org/).
 ## Your first (codebase) contribution
@@ -59,22 +58,23 @@ to help.
 [Zulip community server](https://zulip.readthedocs.io/en/latest/contributing/chat-zulip-org.html),
 paying special attention to the community norms. If you'd like, introduce
 yourself in
-[#new members](https://chat.zulip.org/#narrow/stream/95-new-members), using
+[#new members](https://chat.zulip.org/#narrow/stream/new.20members), using
 your name as the topic. Bonus: tell us about your first impressions of
 Zulip, and anything that felt confusing/broken as you started using the
 product.
 * Read [What makes a great Zulip contributor](#what-makes-a-great-zulip-contributor).
 * [Install the development environment](https://zulip.readthedocs.io/en/latest/development/overview.html),
 getting help in
-[#development help](https://chat.zulip.org/#narrow/stream/49-development-help)
+[#development help](https://chat.zulip.org/#narrow/stream/development.20help)
 if you run into any troubles.
 * Read the
 [Zulip guide to Git](https://zulip.readthedocs.io/en/latest/git/index.html)
-and do the Git tutorial (coming soon) if you are unfamiliar with
-Git, getting help in
-[#git help](https://chat.zulip.org/#narrow/stream/44-git-help) if
-you run into any troubles. Be sure to check out the
-[extremely useful Zulip-specific tools page](https://zulip.readthedocs.io/en/latest/git/zulip-tools.html).
+and do the Git tutorial (coming soon) if you are unfamiliar with Git,
+getting help in
+[#git help](https://chat.zulip.org/#narrow/stream/git.20help) if you run
+into any troubles.
+* Sign the
+[Dropbox Contributor License Agreement](https://opensource.dropbox.com/cla/).
 ### Picking an issue
@@ -84,53 +84,43 @@ on.
 * If you're interested in
 [mobile](https://github.com/zulip/zulip-mobile/issues?q=is%3Aopen+is%3Aissue),
-[desktop](https://github.com/zulip/zulip-desktop/issues?q=is%3Aopen+is%3Aissue),
+[desktop](https://github.com/zulip/zulip-electron/issues?q=is%3Aopen+is%3Aissue),
 or
 [bots](https://github.com/zulip/python-zulip-api/issues?q=is%3Aopen+is%3Aissue)
 development, check the respective links for open issues, or post in
-[#mobile](https://chat.zulip.org/#narrow/stream/48-mobile),
-[#desktop](https://chat.zulip.org/#narrow/stream/16-desktop), or
-[#integration](https://chat.zulip.org/#narrow/stream/127-integrations).
-* For the main server and web repository, we recommend browsing
-recently opened issues to look for issues you are confident you can
-fix correctly in a way that clearly communicates why your changes
-are the correct fix. Our GitHub workflow bot, zulipbot, limits
-users who have 0 commits merged to claiming a single issue labeled
-with "good first issue" or "help wanted".
+[#mobile](https://chat.zulip.org/#narrow/stream/mobile),
+[#electron](https://chat.zulip.org/#narrow/stream/electron), or
+[#bots](https://chat.zulip.org/#narrow/stream/bots).
+* For the main server and web repository, start by looking through issues
+with the label
+[good first issue](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A"good+first+issue").
+These are smaller projects particularly suitable for a first contribution.
 * We also partition all of our issues in the main repo into areas like
 admin, compose, emoji, hotkeys, i18n, onboarding, search, etc. Look
 through our [list of labels](https://github.com/zulip/zulip/labels), and
 click on some of the `area:` labels to see all the issues related to your
 areas of interest.
 * If the lists of issues are overwhelming, post in
-[#new members](https://chat.zulip.org/#narrow/stream/95-new-members) with a
+[#new members](https://chat.zulip.org/#narrow/stream/new.20members) with a
 bit about your background and interests, and we'll help you out. The most
 important thing to say is whether you're looking for a backend (Python),
-frontend (JavaScript and TypeScript), mobile (React Native), desktop (Electron),
-documentation (English) or visual design (JavaScript/TypeScript + CSS) issue, and a
+frontend (JavaScript), mobile (React Native), desktop (Electron),
+documentation (English) or visual design (JavaScript + CSS) issue, and a
 bit about your programming experience and available time.
 We also welcome suggestions of features that you feel would be valuable or
 changes that you feel would make Zulip a better open source project. If you
 have a new feature you'd like to add, we recommend you start by posting in
-[#new members](https://chat.zulip.org/#narrow/stream/95-new-members) with the
+[#new members](https://chat.zulip.org/#narrow/stream/new.20members) with the
 feature idea and the problem that you're hoping to solve.
 Other notes:
 * For a first pull request, it's better to aim for a smaller contribution
 than a bigger one. Many first contributions have fewer than 10 lines of
 changes (not counting changes to tests).
-* The full list of issues explicitly looking for a contributor can be
-found with the
-[good first issue](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22)
-and
+* The full list of issues looking for a contributor can be found with the
 [help wanted](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22)
-labels. Avoid issues with the "difficult" label unless you
-understand why it is difficult and are confident you can resolve the
-issue correctly and completely. Issues without one of these labels
-are fair game if Tim has written a clear technical design proposal
-in the issue, or it is a bug that you can reproduce and you are
-confident you can fix the issue correctly.
+label.
 * For most new contributors, there's a lot to learn while making your first
 pull request. It's OK if it takes you a while; that's normal! You'll be
 able to work a lot faster as you build experience.
@@ -142,12 +132,6 @@ the issue thread. [Zulipbot](https://github.com/zulip/zulipbot) is a GitHub
 workflow bot; it will assign you to the issue and label the issue as "in
 progress". Some additional notes:
-* You can only claim issues with the
-[good first issue](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22)
-or
-[help wanted](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22)
-labels. Zulipbot will give you an error if you try to claim an issue
-without one of those labels.
 * You're encouraged to ask questions on how to best implement or debug your
 changes -- the Zulip maintainers are excited to answer questions to help
 you stay unblocked and working efficiently. You can ask questions on
@@ -170,8 +154,9 @@ labels.
 ## What makes a great Zulip contributor?
-Zulip has a lot of experience working with new contributors. In our
-experience, these are the best predictors of success:
+Zulip runs a lot of [internship programs](#internship-programs), so we have
+a lot of experience with new contributors. In our experience, these are the
+best predictors of success:
 * Posting good questions. This generally means explaining your current
 understanding, saying what you've done or tried so far, and including
@@ -191,8 +176,8 @@ experience, these are the best predictors of success:
 able to address things within a few days.
 * Being helpful and friendly on chat.zulip.org.
-These are also the main criteria we use to select candidates for all
-of our outreach programs.
+These are also the main criteria we use to select interns for all of our
+internship programs.
 ## Reporting issues
@@ -201,9 +186,9 @@ bugs, feel free to just open an issue on the relevant project on GitHub.
 If you have a feature request or are not yet sure what the underlying bug
 is, the best place to post issues is
-[#issues](https://chat.zulip.org/#narrow/stream/9-issues) (or
-[#mobile](https://chat.zulip.org/#narrow/stream/48-mobile) or
-[#desktop](https://chat.zulip.org/#narrow/stream/16-desktop)) on the
+[#issues](https://chat.zulip.org/#narrow/stream/issues) (or
+[#mobile](https://chat.zulip.org/#narrow/stream/mobile) or
+[#electron](https://chat.zulip.org/#narrow/stream/electron)) on the
 [Zulip community server](https://zulip.readthedocs.io/en/latest/contributing/chat-zulip-org.html).
 This allows us to interactively figure out what is going on, let you know if
 a similar issue has already been opened, and collect any other information
@@ -213,9 +198,8 @@ and how to reproduce it if known, your browser/OS if relevant, and a
 if appropriate.
 **Reporting security issues**. Please do not report security issues
-publicly, including on public streams on chat.zulip.org. You can
-email security@zulip.com. We create a CVE for every security
-issue in our released software.
+publicly, including on public streams on chat.zulip.org. You can email
+zulip-security@googlegroups.com. We create a CVE for every security issue.
 ## User feedback
@@ -230,7 +214,7 @@ to:
 * Pros and cons: What are the pros and cons of Zulip for your organization,
 and the pros and cons of other products you are evaluating?
 * Features: What are the features that are most important for your
-organization? In the best-case scenario, what would your chat solution do
+organization? In the best case scenario, what would your chat solution do
 for you?
 * Onboarding: If you remember it, what was your impression during your first
 few minutes of using Zulip? What did you notice, and how did you feel? Was
@@ -238,20 +222,21 @@ to:
 * Organization: What does your organization do? How big is the organization?
 A link to your organization's website?
-## Outreach programs
+## Internship programs
-Zulip participates in [Google Summer of Code
-(GSoC)](https://developers.google.com/open-source/gsoc/) every year.
-In the past, we've also participated in
-[Outreachy](https://www.outreachy.org/), [Google
-Code-In](https://developers.google.com/open-source/gci/), and hosted
-summer interns from Harvard, MIT, and Stanford.
+Zulip runs internship programs with
+[Outreachy](https://www.outreachy.org/),
+[Google Summer of Code (GSoC)](https://developers.google.com/open-source/gsoc/)
+[1], and the
+[MIT Externship program](https://alum.mit.edu/students/NetworkwithAlumni/ExternshipProgram),
+and has in the past taken summer interns from Harvard, MIT, and
+Stanford.
 While each third-party program has its own rules and requirements, the
 Zulip community approaches all of these programs with these ideas in
 mind:
 * We try to make the application process as valuable for the applicant as
-possible. Expect high-quality code reviews, a supportive community, and
+possible. Expect high quality code reviews, a supportive community, and
 publicly viewable patches you can link to from your resume, regardless of
 whether you are selected.
 * To apply, you'll have to submit at least one pull request to a Zulip
@@ -265,22 +250,26 @@ mind:
 application to make mistakes in your first few PRs as long as your
 work improves.
-Most of our outreach program participants end up sticking around the
-project long-term, and many have become core team members, maintaining
-important parts of the project. We hope you apply!
+Zulip also participates in
+[Google Code-In](https://developers.google.com/open-source/gci/). Our
+selection criteria for Finalists and Grand Prize Winners is the same as our
+selection criteria for interns above.
+Most of our interns end up sticking around the project long-term, and many
+quickly become core team members. We hope you apply!
 ### Google Summer of Code
-The largest outreach program Zulip participates in is GSoC (14
-students in 2017; 11 in 2018; 17 in 2019). While we don't control how
-many slots Google allocates to Zulip, we hope to mentor a similar
-number of students in future summers.
+GSoC is by far the largest of our internship programs (we had 14 GSoC
+students in summer 2017). While we don't control how many slots
+Google allocates to Zulip, we hope to mentor a similar number of
+students in 2018.
 If you're reading this well before the application deadline and want
 to make your application strong, we recommend getting involved in the
 community and fixing issues in Zulip now. Having good contributions
-and building a reputation for doing good work is the best way to have
-a strong application. About half of Zulip's GSoC students for Summer
+and building a reputation for doing good work is best way to have a
+strong application. About half of Zulip's GSoC students for Summer
 2017 had made significant contributions to the project by February
 2017, and about half had not. Our
 [GSoC project ideas page][gsoc-guide] has lots more details on how
@@ -299,7 +288,11 @@ for ZSoC, we'll contact you when the GSoC results are announced.
 [gsoc-guide]: https://zulip.readthedocs.io/en/latest/overview/gsoc-ideas.html
 [gsoc-faq]: https://developers.google.com/open-source/gsoc/faq
-## Zulip outreach
+[1] Formally, [GSoC isn't an internship][gsoc-faq], but it is similar
+enough that we're treating it as such for the purposes of this
+documentation.
+
+## Zulip Outreach
 **Upvoting Zulip**. Upvotes and reviews make a big difference in the public
 perception of projects like Zulip. We've collected a few sites below
@@ -308,7 +301,7 @@ list typically takes about 15 minutes.
 * Star us on GitHub. There are four main repositories:
 [server/web](https://github.com/zulip/zulip),
 [mobile](https://github.com/zulip/zulip-mobile),
-[desktop](https://github.com/zulip/zulip-desktop), and
+[desktop](https://github.com/zulip/zulip-electron), and
 [Python API](https://github.com/zulip/python-zulip-api).
 * [Follow us](https://twitter.com/zulip) on Twitter.
@@ -333,7 +326,7 @@ have been using Zulip for a while and want to contribute more.
 about a technical aspect of Zulip can be a great way to spread the word
 about Zulip.
-We also occasionally [publish](https://blog.zulip.org/) long-form
+We also occasionally [publish](http://blog.zulip.org/) longer form
 articles related to Zulip. Our posts typically get tens of thousands
 of views, and we always have good ideas for blog posts that we can
 outline but don't have time to write. If you are an experienced writer

17 Dockerfile-dev Normal file

@@ -0,0 +1,17 @@
FROM ubuntu:trusty
EXPOSE 9991
RUN apt-get update && apt-get install -y wget
RUN locale-gen en_US.UTF-8
RUN useradd -d /home/zulip -m zulip && echo 'zulip ALL=(ALL) NOPASSWD:ALL' >> /etc/sudoers
USER zulip
RUN ln -nsf /srv/zulip ~/zulip
RUN echo 'export LC_ALL="en_US.UTF-8" LANG="en_US.UTF-8" LANGUAGE="en_US.UTF-8"' >> ~zulip/.bashrc
WORKDIR /srv/zulip


@@ -1,15 +0,0 @@
# To build run `docker build -f Dockerfile-postgresql .` from the root of the
# zulip repo.
# Currently the postgres images do not support automatic upgrading of
# the on-disk data in volumes. So the base image cannot currently be upgraded
# without users needing a manual pgdump and restore.
# Install hunspell, zulip stop words, and run zulip database
# init.
FROM groonga/pgroonga:latest-alpine-10-slim
RUN apk add -U --no-cache hunspell-en
RUN ln -sf /usr/share/hunspell/en_US.dic /usr/local/share/postgresql/tsearch_data/en_us.dict && ln -sf /usr/share/hunspell/en_US.aff /usr/local/share/postgresql/tsearch_data/en_us.affix
COPY puppet/zulip/files/postgresql/zulip_english.stop /usr/local/share/postgresql/tsearch_data/zulip_english.stop
COPY scripts/setup/create-db.sql /docker-entrypoint-initdb.d/zulip-create-db.sql
COPY scripts/setup/create-pgroonga.sql /docker-entrypoint-initdb.d/zulip-create-pgroonga.sql

47 LICENSE

@@ -1,4 +1,24 @@
-Copyright 2011-2020 Dropbox, Inc., Kandra Labs, Inc., and contributors
+Copyright 2011-2017 Dropbox, Inc., Kandra Labs, Inc., and contributors
+
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+    http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.
+
+The software includes some works released by third parties under other
+free and open source licenses. Those works are redistributed under the
+license terms under which the works were received. For more details,
+see the ``docs/THIRDPARTY`` file included with this distribution.
+
+--------------------------------------------------------------------------------
 Apache License
 Version 2.0, January 2004
@@ -176,28 +196,3 @@ Copyright 2011-2020 Dropbox, Inc., Kandra Labs, Inc., and contributors
 of your accepting any such warranty or additional liability.
 END OF TERMS AND CONDITIONS
-APPENDIX: How to apply the Apache License to your work.
-
-To apply the Apache License to your work, attach the following
-boilerplate notice, with the fields enclosed by brackets "[]"
-replaced with your own identifying information. (Don't include
-the brackets!) The text should be enclosed in the appropriate
-comment syntax for the file format. We also recommend that a
-file or class name and description of purpose be included on the
-same "printed page" as the copyright notice for easier
-identification within third-party archives.
-
-Copyright [yyyy] [name of copyright owner]
-
-Licensed under the Apache License, Version 2.0 (the "License");
-you may not use this file except in compliance with the License.
-You may obtain a copy of the License at
-
-    http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software
-distributed under the License is distributed on an "AS IS" BASIS,
-WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-See the License for the specific language governing permissions and
-limitations under the License.

16 NOTICE

@@ -1,16 +0,0 @@
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this project except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
The software includes some works released by third parties under other
free and open source licenses. Those works are redistributed under the
license terms under which the works were received. For more details,
see the ``docs/THIRDPARTY`` file included with this distribution.


@@ -5,14 +5,13 @@ immediacy of real-time chat with the productivity benefits of threaded
 conversations. Zulip is used by open source projects, Fortune 500 companies,
 large standards bodies, and others who need a real-time chat system that
 allows users to easily process hundreds or thousands of messages a day. With
-over 500 contributors merging over 500 commits a month, Zulip is also the
+over 300 contributors merging over 500 commits a month, Zulip is also the
 largest and fastest growing open source group chat project.
-[![CircleCI branch](https://img.shields.io/circleci/project/github/zulip/zulip/master.svg)](https://circleci.com/gh/zulip/zulip/tree/master)
-[![Coverage Status](https://img.shields.io/codecov/c/github/zulip/zulip/master.svg)](https://codecov.io/gh/zulip/zulip/branch/master)
+[![CircleCI Build Status](https://circleci.com/gh/zulip/zulip.svg?style=svg)](https://circleci.com/gh/zulip/zulip)
+[![Travis Build Status](https://travis-ci.org/zulip/zulip.svg?branch=master)](https://travis-ci.org/zulip/zulip)
+[![Coverage Status](https://img.shields.io/codecov/c/github/zulip/zulip.svg)](https://codecov.io/gh/zulip/zulip)
 [![Mypy coverage](https://img.shields.io/badge/mypy-100%25-green.svg)][mypy-coverage]
-[![code style: prettier](https://img.shields.io/badge/code_style-prettier-ff69b4.svg)](https://github.com/prettier/prettier)
-[![GitHub release](https://img.shields.io/github/release/zulip/zulip.svg)](https://github.com/zulip/zulip/releases/latest)
 [![docs](https://readthedocs.org/projects/zulip/badge/?version=latest)](https://zulip.readthedocs.io/en/latest/)
 [![Zulip chat](https://img.shields.io/badge/zulip-join_chat-brightgreen.svg)](https://chat.zulip.org)
 [![Twitter](https://img.shields.io/badge/twitter-@zulip-blue.svg?style=flat)](https://twitter.com/zulip)
@@ -30,12 +29,12 @@ You might be interested in:
 * **Contributing code**. Check out our
 [guide for new contributors](https://zulip.readthedocs.io/en/latest/overview/contributing.html)
 to get started. Zulip prides itself on maintaining a clean and
 well-tested codebase, and a stock of hundreds of
 [beginner-friendly issues][beginner-friendly].
 * **Contributing non-code**.
-[Report an issue](https://zulip.readthedocs.io/en/latest/overview/contributing.html#reporting-issues),
+[Report an issue](https://zulip.readthedocs.io/en/latest/overview/contributing.html#reporting-issue),
 [translate](https://zulip.readthedocs.io/en/latest/translating/translating.html) Zulip
 into your language,
 [write](https://zulip.readthedocs.io/en/latest/overview/contributing.html#zulip-outreach)
@@ -52,26 +51,32 @@ You might be interested in:
 the
 [Zulip community server](https://zulip.readthedocs.io/en/latest/contributing/chat-zulip-org.html). We
 also recommend reading Zulip for
-[open source](https://zulip.com/for/open-source/), Zulip for
-[companies](https://zulip.com/for/companies/), or Zulip for
-[working groups and part time communities](https://zulip.com/for/working-groups-and-communities/).
+[open source](https://zulipchat.com/for/open-source/), Zulip for
+[companies](https://zulipchat.com/for/companies/), or Zulip for
+[working groups and part time communities](https://zulipchat.com/for/working-groups-and-communities/).
-* **Running a Zulip server**. Use a preconfigured [DigitalOcean droplet](https://marketplace.digitalocean.com/apps/zulip),
-[install Zulip](https://zulip.readthedocs.io/en/stable/production/install.html)
-directly, or use Zulip's
-experimental [Docker image](https://zulip.readthedocs.io/en/latest/production/deployment.html#zulip-in-docker).
-Commercial support is available; see <https://zulip.com/plans> for details.
+* **Running a Zulip server**. Setting up a server takes just a couple of
+minutes. Zulip runs on Ubuntu 16.04 Xenial and Ubuntu 14.04 Trusty. The
+installation process is
+[documented here](https://zulip.readthedocs.io/en/1.7.1/prod.html).
+Commercial support is available; see <https://zulipchat.com/plans> for
+details.
-* **Using Zulip without setting up a server**. <https://zulip.com>
-offers free and commercial hosting, including providing our paid
-plan for free to fellow open source projects.
+* **Using Zulip without setting up a server**. <https://zulipchat.com> offers
+free and commercial hosting.
-* **Participating in [outreach
-programs](https://zulip.readthedocs.io/en/latest/overview/contributing.html#outreach-programs)**
-like Google Summer of Code.
+* **Applying for a Zulip internship**. Zulip runs internship programs with
+[Outreachy](https://www.outreachy.org/),
+[Google Summer of Code](https://developers.google.com/open-source/gsoc/),
+and the
+[MIT Externship program](https://alum.mit.edu/students/NetworkwithAlumni/ExternshipProgram). Zulip
+also participates in
+[Google Code-In](https://developers.google.com/open-source/gci/). More
+information is available
+[here](https://zulip.readthedocs.io/en/latest/overview/contributing.html#internship-programs).
-You may also be interested in reading our [blog](https://blog.zulip.org/) or
-following us on [Twitter](https://twitter.com/zulip).
+You may also be interested in reading our [blog](http://blog.zulip.org/) or
+following us on [twitter](https://twitter.com/zulip).
 Zulip is distributed under the
 [Apache 2.0](https://github.com/zulip/zulip/blob/master/LICENSE) license.


@@ -1,28 +0,0 @@
# Security Policy
Security announcements are sent to zulip-announce@googlegroups.com,
so you should subscribe if you are running Zulip in production.
## Reporting a Vulnerability
We love responsible reports of (potential) security issues in Zulip,
whether in the latest release or our development branch.
Our security contact is security@zulip.com. Reporters should expect a
response within 24 hours.
Please include details on the issue and how you'd like to be credited
in our release notes when we publish the fix.
Our [security
model](https://zulip.readthedocs.io/en/latest/production/security-model.html)
document may be a helpful resource.
## Supported Versions
Zulip provides security support for the latest major release, in the
form of minor security/maintenance releases.
We work hard to make
[upgrades](https://zulip.readthedocs.io/en/latest/production/upgrade-or-modify.html#upgrading-to-a-release)
reliable, so that there's no reason to run older major releases.

132 Vagrantfile vendored

@@ -19,6 +19,43 @@ if Vagrant::VERSION == "1.8.7" then
 end
 end
+# Workaround: the lxc-config in vagrant-lxc is incompatible with changes in
+# LXC 2.1.0, found in Ubuntu 17.10 artful.  LXC 2.1.1 (in 18.04 LTS bionic)
+# ignores the old config key, so this will only be needed for artful.
+#
+# vagrant-lxc upstream has an attempted fix:
+#   https://github.com/fgrehm/vagrant-lxc/issues/445
+# but it didn't work in our testing.  This is a temporary issue, so we just
+# hack in a fix: we patch the skeleton `lxc-config` file right in the
+# distribution of the vagrant-lxc "box" we use.  If the user doesn't yet
+# have the box (e.g. on first setup), Vagrant would download it but too
+# late for us to patch it like this; so we prompt them to explicitly add it
+# first and then rerun.
+if ['up', 'provision'].include? ARGV[0]
+  if command? "lxc-ls"
+    LXC_VERSION = `lxc-ls --version`.strip unless defined? LXC_VERSION
+    if LXC_VERSION == "2.1.0"
+      lxc_config_file = ENV['HOME'] + "/.vagrant.d/boxes/fgrehm-VAGRANTSLASH-trusty64-lxc/1.2.0/lxc/lxc-config"
+      if File.file?(lxc_config_file)
+        lines = File.readlines(lxc_config_file)
+        deprecated_line = "lxc.pivotdir = lxc_putold\n"
+        if lines[1] == deprecated_line
+          lines[1] = "# #{deprecated_line}"
+          File.open(lxc_config_file, 'w') do |f|
+            f.puts(lines)
+          end
+        end
+      else
+        puts 'You are running LXC 2.1.0, and fgrehm/trusty64-lxc box is incompatible '\
+             "with it by default. First add the box by doing:\n"\
+             "  vagrant box add https://vagrantcloud.com/fgrehm/trusty64-lxc\n"\
+             'Once this command succeeds, do "vagrant up" again.'
+        exit
+      end
+    end
+  end
+end
 # Workaround: Vagrant removed the atlas.hashicorp.com to
 # vagrantcloud.com redirect in February 2018. The value of
 # DEFAULT_SERVER_URL in Vagrant versions less than 1.9.3 is
@@ -29,38 +66,24 @@ if Vagrant::DEFAULT_SERVER_URL == "atlas.hashicorp.com"
 Vagrant::DEFAULT_SERVER_URL.replace('https://vagrantcloud.com')
 end
-# Monkey patch https://github.com/hashicorp/vagrant/pull/10879 so we
-# can fall back to another provider if docker is not installed.
-begin
-  require Vagrant.source_root.join("plugins", "providers", "docker", "provider")
-rescue LoadError
-else
-  VagrantPlugins::DockerProvider::Provider.class_eval do
-    method(:usable?).owner == singleton_class or def self.usable?(raise_error=false)
-      VagrantPlugins::DockerProvider::Driver.new.execute("docker", "version")
-      true
-    rescue Vagrant::Errors::CommandUnavailable, VagrantPlugins::DockerProvider::Errors::ExecuteError
-      raise if raise_error
-      return false
-    end
-  end
-end
 Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
+  # For LXC. VirtualBox hosts use a different box, described below.
+  config.vm.box = "fgrehm/trusty64-lxc"
 # The Zulip development environment runs on 9991 on the guest.
 host_port = 9991
 http_proxy = https_proxy = no_proxy = nil
 host_ip_addr = "127.0.0.1"
-# System settings for the virtual machine.
-vm_num_cpus = "2"
-vm_memory = "2048"
-ubuntu_mirror = ""
 config.vm.synced_folder ".", "/vagrant", disabled: true
-config.vm.synced_folder ".", "/srv/zulip"
+if (/darwin/ =~ RUBY_PLATFORM) != nil
+  config.vm.synced_folder ".", "/srv/zulip", type: "nfs",
+      linux__nfs_options: ['rw']
+  config.vm.network "private_network", type: "dhcp"
+else
+  config.vm.synced_folder ".", "/srv/zulip"
+end
 vagrant_config_file = ENV['HOME'] + "/.zulip-vagrant-config"
 if File.file?(vagrant_config_file)
@@ -74,9 +97,6 @@ Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
 when "NO_PROXY"; no_proxy = value
 when "HOST_PORT"; host_port = value.to_i
 when "HOST_IP_ADDR"; host_ip_addr = value
-when "GUEST_CPUS"; vm_num_cpus = value
-when "GUEST_MEMORY_MB"; vm_memory = value
-when "UBUNTU_MIRROR"; ubuntu_mirror = value
 end
 end
 end
@@ -101,30 +121,32 @@ Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
 end
 config.vm.network "forwarded_port", guest: 9991, host: host_port, host_ip: host_ip_addr
-config.vm.network "forwarded_port", guest: 9994, host: host_port + 3, host_ip: host_ip_addr
-# Specify Docker provider before VirtualBox provider so it's preferred.
-config.vm.provider "docker" do |d, override|
-  d.build_dir = File.join(__dir__, "tools", "setup", "dev-vagrant-docker")
-  d.build_args = ["--build-arg", "VAGRANT_UID=#{Process.uid}"]
-  if !ubuntu_mirror.empty?
-    d.build_args += ["--build-arg", "UBUNTU_MIRROR=#{ubuntu_mirror}"]
-  end
-  d.has_ssh = true
-  d.create_args = ["--ulimit", "nofile=1024:65536"]
-end
+# Specify LXC provider before VirtualBox provider so it's preferred.
+config.vm.provider "lxc" do |lxc|
+  if command? "lxc-ls"
+    LXC_VERSION = `lxc-ls --version`.strip unless defined? LXC_VERSION
+    if LXC_VERSION >= "1.1.0"
+      # Allow start without AppArmor, otherwise Box will not Start on Ubuntu 14.10
+      # see https://github.com/fgrehm/vagrant-lxc/issues/333
+      lxc.customize 'aa_allow_incomplete', 1
+    end
+    if LXC_VERSION >= "2.0.0"
+      lxc.backingstore = 'dir'
+    end
+  end
+end
 config.vm.provider "virtualbox" do |vb, override|
-  override.vm.box = "hashicorp/bionic64"
+  override.vm.box = "ubuntu/trusty64"
   # It's possible we can get away with just 1.5GB; more testing needed
-  vb.memory = vm_memory
-  vb.cpus = vm_num_cpus
+  vb.memory = 2048
+  vb.cpus = 2
 end
-config.vm.provider "parallels" do |prl, override|
-  override.vm.box = "bento/ubuntu-18.04"
-  override.vm.box_version = "202005.21.0"
-  prl.memory = vm_memory
-  prl.cpus = vm_num_cpus
+config.vm.provider "vmware_fusion" do |vb, override|
+  override.vm.box = "puphpet/ubuntu1404-x64"
+  vb.vmx["memsize"] = "2048"
+  vb.vmx["numvcpus"] = "2"
 end
 $provision_script = <<SCRIPT
@@ -136,15 +158,19 @@ set -o pipefail
 # something that we don't want to happen when running provision in a
 # development environment not using Vagrant.
-# Set the Ubuntu mirror
-[ ! '#{ubuntu_mirror}' ] || sudo sed -i 's|http://\\(\\w*\\.\\)*archive\\.ubuntu\\.com/ubuntu/\\? |#{ubuntu_mirror} |' /etc/apt/sources.list
 # Set the MOTD on the system to have Zulip instructions
-sudo ln -nsf /srv/zulip/tools/setup/dev-motd /etc/update-motd.d/99-zulip-dev
-sudo rm -f /etc/update-motd.d/10-help-text
-sudo dpkg --purge landscape-client landscape-common ubuntu-release-upgrader-core update-manager-core update-notifier-common ubuntu-server
-sudo dpkg-divert --add --rename /etc/default/motd-news
-sudo sh -c 'echo ENABLED=0 > /etc/default/motd-news'
+sudo rm -f /etc/update-motd.d/*
+sudo bash -c 'cat << EndOfMessage > /etc/motd
+Welcome to the Zulip development environment! Popular commands:
+* tools/provision - Update the development environment
+* tools/run-dev.py - Run the development server
+* tools/lint - Run the linter (quick and catches many problems)
+* tools/test-* - Run tests (use --help to learn about options)
+Read https://zulip.readthedocs.io/en/latest/testing/testing.html to learn
+how to run individual test suites so that you can get a fast debug cycle.
+EndOfMessage'
 # If the host is running SELinux remount the /sys/fs/selinux directory as read only,
 # needed for apt-get to work.
@@ -172,7 +198,7 @@ if [ ! -w /srv/zulip ]; then
 # sudo is required since our uid is not 1000
 echo '  vagrant halt -f'
 echo '  rm -rf /PATH/TO/ZULIP/CLONE/.vagrant'
-echo '  sudo chown -R 1000:$(id -g) /PATH/TO/ZULIP/CLONE'
+echo '  sudo chown -R 1000:$(whoami) /PATH/TO/ZULIP/CLONE'
 echo "Replace /PATH/TO/ZULIP/CLONE with the path to where zulip code is cloned."
 echo "You can resume setting up your vagrant environment by running:"
 echo "  vagrant up"


@@ -1,35 +1,22 @@
import logging
import time import time
from collections import OrderedDict, defaultdict from collections import OrderedDict, defaultdict
from datetime import datetime, timedelta from datetime import datetime, timedelta
from typing import Callable, Dict, Optional, Sequence, Tuple, Type, Union import logging
from typing import Any, Callable, Dict, List, \
Optional, Text, Tuple, Type, Union
from django.conf import settings from django.conf import settings
from django.db import connection from django.db import connection, models
from django.db.models import F from django.db.models import F
from psycopg2.sql import SQL, Composable, Identifier, Literal
from analytics.models import ( from analytics.models import Anomaly, BaseCount, \
BaseCount, FillState, InstallationCount, RealmCount, StreamCount, \
FillState, UserCount, installation_epoch, last_successful_fill
InstallationCount,
RealmCount,
StreamCount,
UserCount,
installation_epoch,
last_successful_fill,
)
from zerver.lib.logging_util import log_to_file from zerver.lib.logging_util import log_to_file
from zerver.lib.timestamp import ceiling_to_day, ceiling_to_hour, floor_to_hour, verify_UTC from zerver.lib.timestamp import ceiling_to_day, \
from zerver.models import ( ceiling_to_hour, floor_to_hour, verify_UTC
Message, from zerver.models import Message, Realm, RealmAuditLog, \
Realm, Stream, UserActivityInterval, UserProfile, models
RealmAuditLog,
Stream,
UserActivityInterval,
UserProfile,
models,
)
## Logging setup ## ## Logging setup ##
@@ -52,7 +39,7 @@ class CountStat:
self.data_collector = data_collector self.data_collector = data_collector
# might have to do something different for bitfields # might have to do something different for bitfields
if frequency not in self.FREQUENCIES: if frequency not in self.FREQUENCIES:
raise AssertionError(f"Unknown frequency: {frequency}") raise AssertionError("Unknown frequency: %s" % (frequency,))
self.frequency = frequency self.frequency = frequency
if interval is not None: if interval is not None:
self.interval = interval self.interval = interval
@@ -61,8 +48,8 @@ class CountStat:
else: # frequency == CountStat.DAY else: # frequency == CountStat.DAY
self.interval = timedelta(days=1) self.interval = timedelta(days=1)
def __str__(self) -> str: def __str__(self) -> Text:
return f"<CountStat: {self.property}>" return "<CountStat: %s>" % (self.property,)
class LoggingCountStat(CountStat): class LoggingCountStat(CountStat):
def __init__(self, property: str, output_table: Type[BaseCount], frequency: str) -> None: def __init__(self, property: str, output_table: Type[BaseCount], frequency: str) -> None:
@@ -70,39 +57,29 @@ class LoggingCountStat(CountStat):

 class DependentCountStat(CountStat):
     def __init__(self, property: str, data_collector: 'DataCollector', frequency: str,
-                 interval: Optional[timedelta] = None, dependencies: Sequence[str] = []) -> None:
+                 interval: Optional[timedelta]=None, dependencies: List[str]=[]) -> None:
         CountStat.__init__(self, property, data_collector, frequency, interval=interval)
         self.dependencies = dependencies

 class DataCollector:
     def __init__(self, output_table: Type[BaseCount],
-                 pull_function: Optional[Callable[[str, datetime, datetime, Optional[Realm]], int]]) -> None:
+                 pull_function: Optional[Callable[[str, datetime, datetime], int]]) -> None:
         self.output_table = output_table
         self.pull_function = pull_function

 ## CountStat-level operations ##

-def process_count_stat(stat: CountStat, fill_to_time: datetime,
-                       realm: Optional[Realm]=None) -> None:
-    # TODO: The realm argument is not yet supported, in that we don't
-    # have a solution for how to update FillState if it is passed. It
-    # exists solely as partial plumbing for when we do fully implement
-    # doing single-realm analytics runs for use cases like data import.
-    #
-    # Also, note that for the realm argument to be properly supported,
-    # the CountStat object passed in needs to have come from
-    # E.g. get_count_stats(realm), i.e. have the realm_id already
-    # entered into the SQL query defined by the CountState object.
+def process_count_stat(stat: CountStat, fill_to_time: datetime) -> None:
     if stat.frequency == CountStat.HOUR:
         time_increment = timedelta(hours=1)
     elif stat.frequency == CountStat.DAY:
         time_increment = timedelta(days=1)
     else:
-        raise AssertionError(f"Unknown frequency: {stat.frequency}")
+        raise AssertionError("Unknown frequency: %s" % (stat.frequency,))
     verify_UTC(fill_to_time)
     if floor_to_hour(fill_to_time) != fill_to_time:
-        raise ValueError(f"fill_to_time must be on an hour boundary: {fill_to_time}")
+        raise ValueError("fill_to_time must be on an hour boundary: %s" % (fill_to_time,))

     fill_state = FillState.objects.filter(property=stat.property).first()
     if fill_state is None:
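The realm plumbing removed above only works if the realm filter is baked into the SQL when the CountStat is constructed; process_count_stat cannot retroactively scope a finished query. A hedged sketch of the intended call pattern on the side that has get_count_stats (names taken from this diff; `realm` and `fill_to_time` are assumed to be a Realm object and an hour-aligned UTC datetime):

# Sketch only: get_count_stats(realm) bakes "realm_id = <id>" into each
# stat's SQL, so the realm passed here matches the queries being run.
stats = get_count_stats(realm)
stat = stats['messages_sent:is_bot:hour']
process_count_stat(stat, fill_to_time, realm=realm)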
@@ -110,37 +87,37 @@ def process_count_stat(stat: CountStat, fill_to_time: datetime,
         fill_state = FillState.objects.create(property=stat.property,
                                               end_time=currently_filled,
                                               state=FillState.DONE)
-        logger.info("INITIALIZED %s %s", stat.property, currently_filled)
+        logger.info("INITIALIZED %s %s" % (stat.property, currently_filled))
     elif fill_state.state == FillState.STARTED:
-        logger.info("UNDO START %s %s", stat.property, fill_state.end_time)
+        logger.info("UNDO START %s %s" % (stat.property, fill_state.end_time))
         do_delete_counts_at_hour(stat, fill_state.end_time)
         currently_filled = fill_state.end_time - time_increment
         do_update_fill_state(fill_state, currently_filled, FillState.DONE)
-        logger.info("UNDO DONE %s", stat.property)
+        logger.info("UNDO DONE %s" % (stat.property,))
     elif fill_state.state == FillState.DONE:
         currently_filled = fill_state.end_time
     else:
-        raise AssertionError(f"Unknown value for FillState.state: {fill_state.state}.")
+        raise AssertionError("Unknown value for FillState.state: %s." % (fill_state.state,))

     if isinstance(stat, DependentCountStat):
         for dependency in stat.dependencies:
             dependency_fill_time = last_successful_fill(dependency)
             if dependency_fill_time is None:
-                logger.warning("DependentCountStat %s run before dependency %s.",
-                               stat.property, dependency)
+                logger.warning("DependentCountStat %s run before dependency %s." %
+                               (stat.property, dependency))
                 return
             fill_to_time = min(fill_to_time, dependency_fill_time)

     currently_filled = currently_filled + time_increment
     while currently_filled <= fill_to_time:
-        logger.info("START %s %s", stat.property, currently_filled)
+        logger.info("START %s %s" % (stat.property, currently_filled))
         start = time.time()
         do_update_fill_state(fill_state, currently_filled, FillState.STARTED)
-        do_fill_count_stat_at_hour(stat, currently_filled, realm)
+        do_fill_count_stat_at_hour(stat, currently_filled)
         do_update_fill_state(fill_state, currently_filled, FillState.DONE)
         end = time.time()
         currently_filled = currently_filled + time_increment
-        logger.info("DONE %s (%dms)", stat.property, (end-start)*1000)
+        logger.info("DONE %s (%dms)" % (stat.property, (end-start)*1000))

 def do_update_fill_state(fill_state: FillState, end_time: datetime, state: int) -> None:
     fill_state.end_time = end_time
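The FillState machine above is what makes the loop restartable: each slice is marked STARTED before rows are written and DONE after, so a crashed run leaves at most one STARTED slice, which the next run deletes and redoes. A self-contained sketch of the same protocol, with the ORM calls replaced by plain callables (all names here are illustrative, not Zulip APIs):

from datetime import datetime, timedelta

def fill_hour_by_hour(currently_filled: datetime, fill_to_time: datetime,
                      mark_started, fill_one_hour, mark_done) -> None:
    # Advance one hour at a time; persisting STARTED/DONE around each write
    # is what lets crash recovery delete exactly one half-written slice.
    current = currently_filled + timedelta(hours=1)
    while current <= fill_to_time:
        mark_started(current)
        fill_one_hour(current)
        mark_done(current)
        current += timedelta(hours=1)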
@@ -149,15 +126,15 @@ def do_update_fill_state(fill_state: FillState, end_time: datetime, state: int)
 # We assume end_time is valid (e.g. is on a day or hour boundary as appropriate)
 # and is timezone aware. It is the caller's responsibility to enforce this!
-def do_fill_count_stat_at_hour(stat: CountStat, end_time: datetime, realm: Optional[Realm]=None) -> None:
+def do_fill_count_stat_at_hour(stat: CountStat, end_time: datetime) -> None:
     start_time = end_time - stat.interval
     if not isinstance(stat, LoggingCountStat):
         timer = time.time()
         assert(stat.data_collector.pull_function is not None)
-        rows_added = stat.data_collector.pull_function(stat.property, start_time, end_time, realm)
-        logger.info("%s run pull_function (%dms/%sr)",
-                    stat.property, (time.time()-timer)*1000, rows_added)
-    do_aggregate_to_summary_table(stat, end_time, realm)
+        rows_added = stat.data_collector.pull_function(stat.property, start_time, end_time)
+        logger.info("%s run pull_function (%dms/%sr)" %
+                    (stat.property, (time.time()-timer)*1000, rows_added))
+    do_aggregate_to_summary_table(stat, end_time)

 def do_delete_counts_at_hour(stat: CountStat, end_time: datetime) -> None:
     if isinstance(stat, LoggingCountStat):
@@ -170,76 +147,51 @@ def do_delete_counts_at_hour(stat: CountStat, end_time: datetime) -> None:
         RealmCount.objects.filter(property=stat.property, end_time=end_time).delete()
         InstallationCount.objects.filter(property=stat.property, end_time=end_time).delete()

-def do_aggregate_to_summary_table(stat: CountStat, end_time: datetime,
-                                  realm: Optional[Realm]=None) -> None:
+def do_aggregate_to_summary_table(stat: CountStat, end_time: datetime) -> None:
     cursor = connection.cursor()

     # Aggregate into RealmCount
     output_table = stat.data_collector.output_table
-    if realm is not None:
-        realm_clause = SQL("AND zerver_realm.id = {}").format(Literal(realm.id))
-    else:
-        realm_clause = SQL("")
     if output_table in (UserCount, StreamCount):
-        realmcount_query = SQL("""
+        realmcount_query = """
             INSERT INTO analytics_realmcount
                 (realm_id, value, property, subgroup, end_time)
             SELECT
-                zerver_realm.id, COALESCE(sum({output_table}.value), 0), %(property)s,
-                {output_table}.subgroup, %(end_time)s
+                zerver_realm.id, COALESCE(sum(%(output_table)s.value), 0), '%(property)s',
+                %(output_table)s.subgroup, %%(end_time)s
             FROM zerver_realm
-            JOIN {output_table}
+            JOIN %(output_table)s
             ON
-                zerver_realm.id = {output_table}.realm_id
+                zerver_realm.id = %(output_table)s.realm_id
             WHERE
-                {output_table}.property = %(property)s AND
-                {output_table}.end_time = %(end_time)s
-                {realm_clause}
-            GROUP BY zerver_realm.id, {output_table}.subgroup
-        """).format(
-            output_table=Identifier(output_table._meta.db_table),
-            realm_clause=realm_clause,
-        )
+                %(output_table)s.property = '%(property)s' AND
+                %(output_table)s.end_time = %%(end_time)s
+            GROUP BY zerver_realm.id, %(output_table)s.subgroup
+        """ % {'output_table': output_table._meta.db_table,
+               'property': stat.property}
         start = time.time()
-        cursor.execute(realmcount_query, {
-            'property': stat.property,
-            'end_time': end_time,
-        })
+        cursor.execute(realmcount_query, {'end_time': end_time})
         end = time.time()
-        logger.info(
-            "%s RealmCount aggregation (%dms/%sr)",
-            stat.property, (end - start) * 1000, cursor.rowcount,
-        )
+        logger.info("%s RealmCount aggregation (%dms/%sr)" % (
+            stat.property, (end - start) * 1000, cursor.rowcount))

-    if realm is None:
-        # Aggregate into InstallationCount.  Only run if we just
-        # processed counts for all realms.
-        #
-        # TODO: Add support for updating installation data after
-        # changing an individual realm's values.
-        installationcount_query = SQL("""
-            INSERT INTO analytics_installationcount
-                (value, property, subgroup, end_time)
-            SELECT
-                sum(value), %(property)s, analytics_realmcount.subgroup, %(end_time)s
-            FROM analytics_realmcount
-            WHERE
-                property = %(property)s AND
-                end_time = %(end_time)s
-            GROUP BY analytics_realmcount.subgroup
-        """)
-        start = time.time()
-        cursor.execute(installationcount_query, {
-            'property': stat.property,
-            'end_time': end_time,
-        })
-        end = time.time()
-        logger.info(
-            "%s InstallationCount aggregation (%dms/%sr)",
-            stat.property, (end - start) * 1000, cursor.rowcount,
-        )
+    # Aggregate into InstallationCount
+    installationcount_query = """
+        INSERT INTO analytics_installationcount
+            (value, property, subgroup, end_time)
+        SELECT
+            sum(value), '%(property)s', analytics_realmcount.subgroup, %%(end_time)s
+        FROM analytics_realmcount
+        WHERE
+            property = '%(property)s' AND
+            end_time = %%(end_time)s
+        GROUP BY analytics_realmcount.subgroup
+    """ % {'property': stat.property}
+    start = time.time()
+    cursor.execute(installationcount_query, {'end_time': end_time})
+    end = time.time()
+    logger.info("%s InstallationCount aggregation (%dms/%sr)" % (
+        stat.property, (end - start) * 1000, cursor.rowcount))

     cursor.close()

 ## Utility functions called from outside counts.py ##
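The substantive change in this function is how the SQL is assembled: one side interpolates table and property names with Python %-formatting (doubling %% for the placeholders that are still bound by the driver), while the other composes the statement with psycopg2.sql so that identifiers and inlined literals are quoted by the driver rather than pasted in. A minimal sketch of the composition style, assuming psycopg2 is available and using an illustrative table name:

from psycopg2.sql import SQL, Identifier, Literal

def build_realmcount_aggregate(table_name, realm_id=None):
    # Identifier() quotes table/column names; Literal() quotes values that
    # must be inlined; plain %(...)s placeholders are left for execute().
    realm_clause = (SQL("AND zerver_realm.id = {}").format(Literal(realm_id))
                    if realm_id is not None else SQL(""))
    return SQL("""
        SELECT zerver_realm.id, COALESCE(sum({output_table}.value), 0)
        FROM zerver_realm
        JOIN {output_table} ON zerver_realm.id = {output_table}.realm_id
        WHERE {output_table}.property = %(property)s
        {realm_clause}
        GROUP BY zerver_realm.id
    """).format(output_table=Identifier(table_name), realm_clause=realm_clause)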
@@ -248,9 +200,6 @@ def do_aggregate_to_summary_table(stat: CountStat, end_time: datetime,
 def do_increment_logging_stat(zerver_object: Union[Realm, UserProfile, Stream], stat: CountStat,
                               subgroup: Optional[Union[str, int, bool]], event_time: datetime,
                               increment: int=1) -> None:
-    if not increment:
-        return
-
     table = stat.data_collector.output_table
     if table == RealmCount:
         id_args = {'realm': zerver_object}
@@ -277,6 +226,7 @@ def do_drop_all_analytics_tables() -> None:
     RealmCount.objects.all().delete()
     InstallationCount.objects.all().delete()
     FillState.objects.all().delete()
+    Anomaly.objects.all().delete()

 def do_drop_single_stat(property: str) -> None:
     UserCount.objects.filter(property=property).delete()
@@ -287,71 +237,46 @@ def do_drop_single_stat(property: str) -> None:

 ## DataCollector-level operations ##

-QueryFn = Callable[[Dict[str, Composable]], Composable]
-
-def do_pull_by_sql_query(
-    property: str,
-    start_time: datetime,
-    end_time: datetime,
-    query: QueryFn,
-    group_by: Optional[Tuple[models.Model, str]],
-) -> int:
+def do_pull_by_sql_query(property: str, start_time: datetime, end_time: datetime, query: str,
+                         group_by: Optional[Tuple[models.Model, str]]) -> int:
     if group_by is None:
-        subgroup = SQL('NULL')
-        group_by_clause = SQL('')
+        subgroup = 'NULL'
+        group_by_clause = ''
     else:
-        subgroup = Identifier(group_by[0]._meta.db_table, group_by[1])
-        group_by_clause = SQL(', {}').format(subgroup)
+        subgroup = '%s.%s' % (group_by[0]._meta.db_table, group_by[1])
+        group_by_clause = ', ' + subgroup

     # We do string replacement here because cursor.execute will reject a
     # group_by_clause given as a param.
     # We pass in the datetimes as params to cursor.execute so that we don't have to
     # think about how to convert python datetimes to SQL datetimes.
-    query_ = query({
-        'subgroup': subgroup,
-        'group_by_clause': group_by_clause,
-    })
+    query_ = query % {'property': property, 'subgroup': subgroup,
+                      'group_by_clause': group_by_clause}
     cursor = connection.cursor()
-    cursor.execute(query_, {
-        'property': property,
-        'time_start': start_time,
-        'time_end': end_time,
-    })
+    cursor.execute(query_, {'time_start': start_time, 'time_end': end_time})
     rowcount = cursor.rowcount
     cursor.close()
     return rowcount

-def sql_data_collector(
-    output_table: Type[BaseCount],
-    query: QueryFn,
-    group_by: Optional[Tuple[models.Model, str]],
-) -> DataCollector:
-    def pull_function(property: str, start_time: datetime, end_time: datetime,
-                      realm: Optional[Realm] = None) -> int:
-        # The pull function type needs to accept a Realm argument
-        # because the 'minutes_active::day' CountStat uses
-        # DataCollector directly for do_pull_minutes_active, which
-        # requires the realm argument. We ignore it here, because the
-        # realm should have been already encoded in the `query` we're
-        # passed.
+def sql_data_collector(output_table: Type[BaseCount], query: str,
+                       group_by: Optional[Tuple[models.Model, str]]) -> DataCollector:
+    def pull_function(property: str, start_time: datetime, end_time: datetime) -> int:
         return do_pull_by_sql_query(property, start_time, end_time, query, group_by)
     return DataCollector(output_table, pull_function)

-def do_pull_minutes_active(property: str, start_time: datetime, end_time: datetime,
-                           realm: Optional[Realm] = None) -> int:
+def do_pull_minutes_active(property: str, start_time: datetime, end_time: datetime) -> int:
     user_activity_intervals = UserActivityInterval.objects.filter(
-        end__gt=start_time, start__lt=end_time,
+        end__gt=start_time, start__lt=end_time
     ).select_related(
-        'user_profile',
+        'user_profile'
     ).values_list(
         'user_profile_id', 'user_profile__realm_id', 'start', 'end')

-    seconds_active: Dict[Tuple[int, int], float] = defaultdict(float)
+    seconds_active = defaultdict(float)  # type: Dict[Tuple[int, int], float]
     for user_id, realm_id, interval_start, interval_end in user_activity_intervals:
-        if realm is None or realm.id == realm_id:
-            start = max(start_time, interval_start)
-            end = min(end_time, interval_end)
-            seconds_active[(user_id, realm_id)] += (end - start).total_seconds()
+        start = max(start_time, interval_start)
+        end = min(end_time, interval_end)
+        seconds_active[(user_id, realm_id)] += (end - start).total_seconds()

     rows = [UserCount(user_id=ids[0], realm_id=ids[1], property=property,
                       end_time=end_time, value=int(seconds // 60))
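In both versions, do_pull_minutes_active clips each activity interval to the [start_time, end_time) window and accumulates seconds per (user, realm) pair before integer-dividing into minutes. The clipping step in isolation, as a quick self-contained check:

from datetime import datetime, timedelta

def overlap_seconds(window_start, window_end, interval_start, interval_end):
    # Clip the interval to the window; the query's end__gt/start__lt filter
    # guarantees some overlap, so end - start is positive here.
    start = max(window_start, interval_start)
    end = min(window_end, interval_end)
    return (end - start).total_seconds()

day = datetime(2018, 5, 7)
assert overlap_seconds(day, day + timedelta(days=1),
                       day - timedelta(hours=1), day + timedelta(hours=2)) == 2 * 3600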
@@ -359,39 +284,28 @@ def do_pull_minutes_active(property: str, start_time: datetime, end_time: dateti
     UserCount.objects.bulk_create(rows)
     return len(rows)

-def count_message_by_user_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL("""
+count_message_by_user_query = """
     INSERT INTO analytics_usercount
         (user_id, realm_id, value, property, subgroup, end_time)
     SELECT
         zerver_userprofile.id, zerver_userprofile.realm_id, count(*),
-        %(property)s, {subgroup}, %(time_end)s
+        '%(property)s', %(subgroup)s, %%(time_end)s
     FROM zerver_userprofile
     JOIN zerver_message
     ON
         zerver_userprofile.id = zerver_message.sender_id
     WHERE
-        zerver_userprofile.date_joined < %(time_end)s AND
-        zerver_message.date_sent >= %(time_start)s AND
-        {realm_clause}
-        zerver_message.date_sent < %(time_end)s
-    GROUP BY zerver_userprofile.id {group_by_clause}
-""").format(**kwargs, realm_clause=realm_clause)
+        zerver_userprofile.date_joined < %%(time_end)s AND
+        zerver_message.pub_date >= %%(time_start)s AND
+        zerver_message.pub_date < %%(time_end)s
+    GROUP BY zerver_userprofile.id %(group_by_clause)s
+"""

 # Note: ignores the group_by / group_by_clause.
-def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL("""
+count_message_type_by_user_query = """
     INSERT INTO analytics_usercount
         (realm_id, user_id, value, property, subgroup, end_time)
-    SELECT realm_id, id, SUM(count) AS value, %(property)s, message_type, %(time_end)s
+    SELECT realm_id, id, SUM(count) AS value, '%(property)s', message_type, %%(time_end)s
     FROM
     (
         SELECT zerver_userprofile.realm_id, zerver_userprofile.id, count(*),
@@ -409,9 +323,8 @@ def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
         JOIN zerver_message
         ON
             zerver_userprofile.id = zerver_message.sender_id AND
-            zerver_message.date_sent >= %(time_start)s AND
-            {realm_clause}
-            zerver_message.date_sent < %(time_end)s
+            zerver_message.pub_date >= %%(time_start)s AND
+            zerver_message.pub_date < %%(time_end)s
         JOIN zerver_recipient
         ON
             zerver_message.recipient_id = zerver_recipient.id
@@ -423,22 +336,17 @@ def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
             zerver_recipient.type, zerver_stream.invite_only
         ) AS subquery
     GROUP BY realm_id, id, message_type
-""").format(**kwargs, realm_clause=realm_clause)
+"""

 # This query joins to the UserProfile table since all current queries that
 # use this also subgroup on UserProfile.is_bot. If in the future there is a
 # stat that counts messages by stream and doesn't need the UserProfile
 # table, consider writing a new query for efficiency.
-def count_message_by_stream_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("zerver_stream.realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL("""
+count_message_by_stream_query = """
     INSERT INTO analytics_streamcount
         (stream_id, realm_id, value, property, subgroup, end_time)
     SELECT
-        zerver_stream.id, zerver_stream.realm_id, count(*), %(property)s, {subgroup}, %(time_end)s
+        zerver_stream.id, zerver_stream.realm_id, count(*), '%(property)s', %(subgroup)s, %%(time_end)s
     FROM zerver_stream
     JOIN zerver_recipient
     ON
@@ -450,61 +358,48 @@ def count_message_by_stream_query(realm: Optional[Realm]) -> QueryFn:
     ON
         zerver_message.sender_id = zerver_userprofile.id
     WHERE
-        zerver_stream.date_created < %(time_end)s AND
+        zerver_stream.date_created < %%(time_end)s AND
         zerver_recipient.type = 2 AND
-        zerver_message.date_sent >= %(time_start)s AND
-        {realm_clause}
-        zerver_message.date_sent < %(time_end)s
-    GROUP BY zerver_stream.id {group_by_clause}
-""").format(**kwargs, realm_clause=realm_clause)
+        zerver_message.pub_date >= %%(time_start)s AND
+        zerver_message.pub_date < %%(time_end)s
+    GROUP BY zerver_stream.id %(group_by_clause)s
+"""

 # Hardcodes the query needed by active_users:is_bot:day, since that is
 # currently the only stat that uses this.
-def count_user_by_realm_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL("""
+count_user_by_realm_query = """
     INSERT INTO analytics_realmcount
         (realm_id, value, property, subgroup, end_time)
     SELECT
-        zerver_realm.id, count(*), %(property)s, {subgroup}, %(time_end)s
+        zerver_realm.id, count(*),'%(property)s', %(subgroup)s, %%(time_end)s
     FROM zerver_realm
     JOIN zerver_userprofile
     ON
         zerver_realm.id = zerver_userprofile.realm_id
     WHERE
-        zerver_realm.date_created < %(time_end)s AND
-        zerver_userprofile.date_joined >= %(time_start)s AND
-        zerver_userprofile.date_joined < %(time_end)s AND
-        {realm_clause}
+        zerver_realm.date_created < %%(time_end)s AND
+        zerver_userprofile.date_joined >= %%(time_start)s AND
+        zerver_userprofile.date_joined < %%(time_end)s AND
         zerver_userprofile.is_active = TRUE
-    GROUP BY zerver_realm.id {group_by_clause}
-""").format(**kwargs, realm_clause=realm_clause)
+    GROUP BY zerver_realm.id %(group_by_clause)s
+"""

 # Currently hardcodes the query needed for active_users_audit:is_bot:day.
 # Assumes that a user cannot have two RealmAuditLog entries with the same event_time and
-# event_type in [RealmAuditLog.USER_CREATED, USER_DEACTIVATED, etc].
+# event_type in ['user_created', 'user_deactivated', etc].
 # In particular, it's important to ensure that migrations don't cause that to happen.
-def check_realmauditlog_by_user_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL("""
+check_realmauditlog_by_user_query = """
     INSERT INTO analytics_usercount
         (user_id, realm_id, value, property, subgroup, end_time)
     SELECT
-        ral1.modified_user_id, ral1.realm_id, 1, %(property)s, {subgroup}, %(time_end)s
+        ral1.modified_user_id, ral1.realm_id, 1, '%(property)s', %(subgroup)s, %%(time_end)s
     FROM zerver_realmauditlog ral1
     JOIN (
         SELECT modified_user_id, max(event_time) AS max_event_time
         FROM zerver_realmauditlog
         WHERE
-            event_type in ({user_created}, {user_activated}, {user_deactivated}, {user_reactivated}) AND
-            {realm_clause}
-            event_time < %(time_end)s
+            event_type in ('user_created', 'user_deactivated', 'user_activated', 'user_reactivated') AND
+            event_time < %%(time_end)s
         GROUP BY modified_user_id
     ) ral2
     ON
@@ -514,185 +409,128 @@ def check_realmauditlog_by_user_query(realm: Optional[Realm]) -> QueryFn:
     ON
         ral1.modified_user_id = zerver_userprofile.id
     WHERE
-        ral1.event_type in ({user_created}, {user_activated}, {user_reactivated})
-""").format(
-    **kwargs,
-    user_created=Literal(RealmAuditLog.USER_CREATED),
-    user_activated=Literal(RealmAuditLog.USER_ACTIVATED),
-    user_deactivated=Literal(RealmAuditLog.USER_DEACTIVATED),
-    user_reactivated=Literal(RealmAuditLog.USER_REACTIVATED),
-    realm_clause=realm_clause,
-)
+        ral1.event_type in ('user_created', 'user_activated', 'user_reactivated')
+"""

-def check_useractivityinterval_by_user_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL("""
+check_useractivityinterval_by_user_query = """
     INSERT INTO analytics_usercount
         (user_id, realm_id, value, property, subgroup, end_time)
     SELECT
-        zerver_userprofile.id, zerver_userprofile.realm_id, 1, %(property)s, {subgroup}, %(time_end)s
+        zerver_userprofile.id, zerver_userprofile.realm_id, 1, '%(property)s', %(subgroup)s, %%(time_end)s
     FROM zerver_userprofile
     JOIN zerver_useractivityinterval
     ON
         zerver_userprofile.id = zerver_useractivityinterval.user_profile_id
     WHERE
-        zerver_useractivityinterval.end >= %(time_start)s AND
-        {realm_clause}
-        zerver_useractivityinterval.start < %(time_end)s
-    GROUP BY zerver_userprofile.id {group_by_clause}
-""").format(**kwargs, realm_clause=realm_clause)
+        zerver_useractivityinterval.end >= %%(time_start)s AND
+        zerver_useractivityinterval.start < %%(time_end)s
+    GROUP BY zerver_userprofile.id %(group_by_clause)s
+"""

-def count_realm_active_humans_query(realm: Optional[Realm]) -> QueryFn:
-    if realm is None:
-        realm_clause = SQL("")
-    else:
-        realm_clause = SQL("realm_id = {} AND").format(Literal(realm.id))
-    return lambda kwargs: SQL("""
+count_realm_active_humans_query = """
     INSERT INTO analytics_realmcount
         (realm_id, value, property, subgroup, end_time)
     SELECT
-        usercount1.realm_id, count(*), %(property)s, NULL, %(time_end)s
+        usercount1.realm_id, count(*), '%(property)s', NULL, %%(time_end)s
     FROM (
         SELECT realm_id, user_id
         FROM analytics_usercount
        WHERE
             property = 'active_users_audit:is_bot:day' AND
             subgroup = 'false' AND
-            {realm_clause}
-            end_time = %(time_end)s
+            end_time = %%(time_end)s
     ) usercount1
     JOIN (
         SELECT realm_id, user_id
         FROM analytics_usercount
         WHERE
             property = '15day_actives::day' AND
-            {realm_clause}
-            end_time = %(time_end)s
+            end_time = %%(time_end)s
     ) usercount2
     ON
         usercount1.user_id = usercount2.user_id
     GROUP BY usercount1.realm_id
-""").format(**kwargs, realm_clause=realm_clause)
+"""

 # Currently unused and untested
-count_stream_by_realm_query = lambda kwargs: SQL("""
+count_stream_by_realm_query = """
     INSERT INTO analytics_realmcount
         (realm_id, value, property, subgroup, end_time)
     SELECT
-        zerver_realm.id, count(*), %(property)s, {subgroup}, %(time_end)s
+        zerver_realm.id, count(*), '%(property)s', %(subgroup)s, %%(time_end)s
     FROM zerver_realm
     JOIN zerver_stream
     ON
         zerver_realm.id = zerver_stream.realm_id AND
     WHERE
-        zerver_realm.date_created < %(time_end)s AND
-        zerver_stream.date_created >= %(time_start)s AND
-        zerver_stream.date_created < %(time_end)s
-    GROUP BY zerver_realm.id {group_by_clause}
-""").format(**kwargs)
+        zerver_realm.date_created < %%(time_end)s AND
+        zerver_stream.date_created >= %%(time_start)s AND
+        zerver_stream.date_created < %%(time_end)s
+    GROUP BY zerver_realm.id %(group_by_clause)s
+"""

-def get_count_stats(realm: Optional[Realm]=None) -> Dict[str, CountStat]:
-    ## CountStat declarations ##
+## CountStat declarations ##

-    count_stats_ = [
+count_stats_ = [
     # Messages Sent stats
     # Stats that count the number of messages sent in various ways.
     # These are also the set of stats that read from the Message table.
     CountStat('messages_sent:is_bot:hour',
-              sql_data_collector(UserCount, count_message_by_user_query(
-                  realm), (UserProfile, 'is_bot')),
-              CountStat.HOUR),
-    CountStat('messages_sent:message_type:day',
-              sql_data_collector(
-                  UserCount, count_message_type_by_user_query(realm), None),
-              CountStat.DAY),
-    CountStat('messages_sent:client:day',
-              sql_data_collector(UserCount, count_message_by_user_query(realm),
-                                 (Message, 'sending_client_id')), CountStat.DAY),
-    CountStat('messages_in_stream:is_bot:day',
-              sql_data_collector(StreamCount, count_message_by_stream_query(realm),
-                                 (UserProfile, 'is_bot')), CountStat.DAY),
+              sql_data_collector(UserCount, count_message_by_user_query, (UserProfile, 'is_bot')),
+              CountStat.HOUR),
+    CountStat('messages_sent:message_type:day',
+              sql_data_collector(UserCount, count_message_type_by_user_query, None), CountStat.DAY),
+    CountStat('messages_sent:client:day',
+              sql_data_collector(UserCount, count_message_by_user_query, (Message, 'sending_client_id')),
+              CountStat.DAY),
+    CountStat('messages_in_stream:is_bot:day',
+              sql_data_collector(StreamCount, count_message_by_stream_query, (UserProfile, 'is_bot')),
+              CountStat.DAY),

     # Number of Users stats
     # Stats that count the number of active users in the UserProfile.is_active sense.

     # 'active_users_audit:is_bot:day' is the canonical record of which users were
     # active on which days (in the UserProfile.is_active sense).
     # Important that this stay a daily stat, so that 'realm_active_humans::day' works as expected.
     CountStat('active_users_audit:is_bot:day',
-              sql_data_collector(UserCount, check_realmauditlog_by_user_query(
-                  realm), (UserProfile, 'is_bot')),
-              CountStat.DAY),
-
-    # Important note: LoggingCountStat objects aren't passed the
-    # Realm argument, because by nature they have a logging
-    # structure, not a pull-from-database structure, so there's no
-    # way to compute them for a single realm after the fact (the
-    # use case for passing a Realm argument).
-
+              sql_data_collector(UserCount, check_realmauditlog_by_user_query, (UserProfile, 'is_bot')),
+              CountStat.DAY),
     # Sanity check on 'active_users_audit:is_bot:day', and a archetype for future LoggingCountStats.
     # In RealmCount, 'active_users_audit:is_bot:day' should be the partial
     # sum sequence of 'active_users_log:is_bot:day', for any realm that
     # started after the latter stat was introduced.
-    LoggingCountStat('active_users_log:is_bot:day',
-                     RealmCount, CountStat.DAY),
+    LoggingCountStat('active_users_log:is_bot:day', RealmCount, CountStat.DAY),
     # Another sanity check on 'active_users_audit:is_bot:day'. Is only an
     # approximation, e.g. if a user is deactivated between the end of the
     # day and when this stat is run, they won't be counted. However, is the
     # simplest of the three to inspect by hand.
     CountStat('active_users:is_bot:day',
-              sql_data_collector(RealmCount, count_user_by_realm_query(realm), (UserProfile, 'is_bot')),
+              sql_data_collector(RealmCount, count_user_by_realm_query, (UserProfile, 'is_bot')),
               CountStat.DAY, interval=TIMEDELTA_MAX),

-    # Messages read stats.  messages_read::hour is the total
-    # number of messages read, whereas
-    # messages_read_interactions::hour tries to count the total
-    # number of UI interactions resulting in messages being marked
-    # as read (imperfect because of batching of some request
-    # types, but less likely to be overwhelmed by a single bulk
-    # operation).
-    LoggingCountStat('messages_read::hour', UserCount, CountStat.HOUR),
-    LoggingCountStat('messages_read_interactions::hour', UserCount, CountStat.HOUR),
-
     # User Activity stats
     # Stats that measure user activity in the UserActivityInterval sense.
-    CountStat('1day_actives::day',
-              sql_data_collector(
-                  UserCount, check_useractivityinterval_by_user_query(realm), None),
-              CountStat.DAY, interval=timedelta(days=1)-UserActivityInterval.MIN_INTERVAL_LENGTH),
-    CountStat('7day_actives::day',
-              sql_data_collector(
-                  UserCount, check_useractivityinterval_by_user_query(realm), None),
-              CountStat.DAY, interval=timedelta(days=7)-UserActivityInterval.MIN_INTERVAL_LENGTH),
     CountStat('15day_actives::day',
-              sql_data_collector(
-                  UserCount, check_useractivityinterval_by_user_query(realm), None),
+              sql_data_collector(UserCount, check_useractivityinterval_by_user_query, None),
               CountStat.DAY, interval=timedelta(days=15)-UserActivityInterval.MIN_INTERVAL_LENGTH),
-    CountStat('minutes_active::day', DataCollector(
-        UserCount, do_pull_minutes_active), CountStat.DAY),
+    CountStat('minutes_active::day', DataCollector(UserCount, do_pull_minutes_active), CountStat.DAY),

     # Rate limiting stats

     # Used to limit the number of invitation emails sent by a realm
     LoggingCountStat('invites_sent::day', RealmCount, CountStat.DAY),

     # Dependent stats
     # Must come after their dependencies.

     # Canonical account of the number of active humans in a realm on each day.
     DependentCountStat('realm_active_humans::day',
-                       sql_data_collector(
-                           RealmCount, count_realm_active_humans_query(realm), None),
+                       sql_data_collector(RealmCount, count_realm_active_humans_query, None),
                        CountStat.DAY,
-                       dependencies=['active_users_audit:is_bot:day', '15day_actives::day']),
-    ]
+                       dependencies=['active_users_audit:is_bot:day', '15day_actives::day'])
+]

-    return OrderedDict([(stat.property, stat) for stat in count_stats_])
-
-# To avoid refactoring for now COUNT_STATS can be used as before
-COUNT_STATS = get_count_stats()
+COUNT_STATS = OrderedDict([(stat.property, stat) for stat in count_stats_])
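Both sides end with the same lookup table: an OrderedDict keyed by property string. On the side that grew get_count_stats(realm), COUNT_STATS is simply the realm=None instantiation, so existing callers keep working unchanged. A typical lookup (interactive sketch):

stat = COUNT_STATS['realm_active_humans::day']
stat.frequency      # 'day'
stat.dependencies   # ['active_users_audit:is_bot:day', '15day_actives::day']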


@@ -4,7 +4,6 @@ from typing import List
 from analytics.lib.counts import CountStat

 def generate_time_series_data(days: int=100, business_hours_base: float=10,
                               non_business_hours_base: float=10, growth: float=1,
                               autocorrelation: float=0, spikiness: float=1,
@@ -44,10 +43,10 @@ def generate_time_series_data(days: int=100, business_hours_base: float=10,
             [24*non_business_hours_base] * 2
         holidays = [random() < holiday_rate for i in range(days)]
     else:
-        raise AssertionError(f"Unknown frequency: {frequency}")
+        raise AssertionError("Unknown frequency: %s" % (frequency,))
     if length < 2:
         raise AssertionError("Must be generating at least 2 data points. "
-                             f"Currently generating {length}")
+                             "Currently generating %s" % (length,))
     growth_base = growth ** (1. / (length-1))
     values_no_noise = [seasonality[i % len(seasonality)] * (growth_base**i) for i in range(length)]
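The growth_base line spreads the total growth factor evenly across the series: the cumulative factor after length points, growth_base**(length-1), equals growth exactly. A quick numeric check of that identity:

growth = 8.0
length = 4
growth_base = growth ** (1. / (length - 1))         # 2.0
values = [growth_base ** i for i in range(length)]  # [1.0, 2.0, 4.0, 8.0]
assert abs(values[-1] / values[0] - growth) < 1e-9  # first-to-last ratio is `growth`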


@@ -4,7 +4,6 @@ from typing import List, Optional
 from analytics.lib.counts import CountStat
 from zerver.lib.timestamp import floor_to_day, floor_to_hour, verify_UTC

 # If min_length is None, returns end_times from ceiling(start) to floor(end), inclusive.
 # If min_length is greater than 0, pads the list to the left.
 # So informally, time_range(Sep 20, Sep 22, day, None) returns [Sep 20, Sep 21, Sep 22],
@@ -20,7 +19,7 @@ def time_range(start: datetime, end: datetime, frequency: str,
         end = floor_to_day(end)
         step = timedelta(days=1)
     else:
-        raise AssertionError(f"Unknown frequency: {frequency}")
+        raise AssertionError("Unknown frequency: %s" % (frequency,))

     times = []
     if min_length is not None:
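Reading the contract in the comments above: with min_length=None the range runs from ceiling(start) to floor(end) inclusive, and a larger min_length pads additional earlier end_times onto the left. Illustrated for daily frequency (comments only; the exact API is in the function this hunk modifies):

# time_range(Sep 20, Sep 22, CountStat.DAY, None)
#     -> [Sep 20, Sep 21, Sep 22]                  # ceiling(start)..floor(end)
# time_range(Sep 20, Sep 22, CountStat.DAY, 5)
#     -> [Sep 18, Sep 19, Sep 20, Sep 21, Sep 22]  # padded on the left to length 5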


@@ -0,0 +1,81 @@
import datetime
import logging
import time
from typing import Any, Dict

from django.core.management.base import BaseCommand, CommandParser

from zerver.lib.timestamp import timestamp_to_datetime
from zerver.models import Message, Recipient

def compute_stats(log_level: int) -> None:
    logger = logging.getLogger()
    logger.setLevel(log_level)

    one_week_ago = timestamp_to_datetime(time.time()) - datetime.timedelta(weeks=1)
    mit_query = Message.objects.filter(sender__realm__string_id="zephyr",
                                       recipient__type=Recipient.STREAM,
                                       pub_date__gt=one_week_ago)
    for bot_sender_start in ["imap.", "rcmd.", "sys."]:
        mit_query = mit_query.exclude(sender__email__startswith=(bot_sender_start))
    # Filtering for "/" covers tabbott/extra@ and all the daemon/foo bots.
    mit_query = mit_query.exclude(sender__email__contains=("/"))
    mit_query = mit_query.exclude(sender__email__contains=("aim.com"))
    mit_query = mit_query.exclude(
        sender__email__in=["rss@mit.edu", "bash@mit.edu", "apache@mit.edu",
                           "bitcoin@mit.edu", "lp@mit.edu", "clocks@mit.edu",
                           "root@mit.edu", "nagios@mit.edu",
                           "www-data|local-realm@mit.edu"])
    user_counts = {}  # type: Dict[str, Dict[str, int]]
    for m in mit_query.select_related("sending_client", "sender"):
        email = m.sender.email
        user_counts.setdefault(email, {})
        user_counts[email].setdefault(m.sending_client.name, 0)
        user_counts[email][m.sending_client.name] += 1

    total_counts = {}  # type: Dict[str, int]
    total_user_counts = {}  # type: Dict[str, int]
    for email, counts in user_counts.items():
        total_user_counts.setdefault(email, 0)
        for client_name, count in counts.items():
            total_counts.setdefault(client_name, 0)
            total_counts[client_name] += count
            total_user_counts[email] += count

    logging.debug("%40s | %10s | %s" % ("User", "Messages", "Percentage Zulip"))
    top_percents = {}  # type: Dict[int, float]
    for size in [10, 25, 50, 100, 200, len(total_user_counts.keys())]:
        top_percents[size] = 0.0
    for i, email in enumerate(sorted(total_user_counts.keys(),
                                     key=lambda x: -total_user_counts[x])):
        percent_zulip = round(100 - (user_counts[email].get("zephyr_mirror", 0)) * 100. /
                              total_user_counts[email], 1)
        for size in top_percents.keys():
            top_percents.setdefault(size, 0)
            if i < size:
                top_percents[size] += (percent_zulip * 1.0 / size)
        logging.debug("%40s | %10s | %s%%" % (email, total_user_counts[email],
                                              percent_zulip))

    logging.info("")
    for size in sorted(top_percents.keys()):
        logging.info("Top %6s | %s%%" % (size, round(top_percents[size], 1)))

    grand_total = sum(total_counts.values())
    print(grand_total)
    logging.info("%15s | %s" % ("Client", "Percentage"))
    for client in total_counts.keys():
        logging.info("%15s | %s%%" % (client, round(100. * total_counts[client] / grand_total, 1)))

class Command(BaseCommand):
    help = "Compute statistics on MIT Zephyr usage."

    def add_arguments(self, parser: CommandParser) -> None:
        parser.add_argument('--verbose', default=False, action='store_true')

    def handle(self, *args: Any, **options: Any) -> None:
        level = logging.INFO
        if options["verbose"]:
            level = logging.DEBUG
        compute_stats(level)
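The percent_zulip formula treats every message not sent via the zephyr_mirror client as Zulip usage. A worked instance with illustrative numbers:

counts_for_user = {'zephyr_mirror': 30, 'website': 50, 'desktop': 20}
total = sum(counts_for_user.values())  # 100 messages this week
percent_zulip = round(100 - counts_for_user.get('zephyr_mirror', 0) * 100. / total, 1)
assert percent_zulip == 70.0  # 70% of this user's traffic came through Zulip clients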


@@ -0,0 +1,56 @@
import datetime
from typing import Any, Dict

from django.core.management.base import BaseCommand, CommandParser
from django.utils.timezone import utc

from zerver.lib.statistics import seconds_usage_between
from zerver.models import UserProfile

def analyze_activity(options: Dict[str, Any]) -> None:
    day_start = datetime.datetime.strptime(options["date"], "%Y-%m-%d").replace(tzinfo=utc)
    day_end = day_start + datetime.timedelta(days=options["duration"])

    user_profile_query = UserProfile.objects.all()
    if options["realm"]:
        user_profile_query = user_profile_query.filter(realm__string_id=options["realm"])

    print("Per-user online duration:\n")
    total_duration = datetime.timedelta(0)
    for user_profile in user_profile_query:
        duration = seconds_usage_between(user_profile, day_start, day_end)

        if duration == datetime.timedelta(0):
            continue

        total_duration += duration
        print("%-*s%s" % (37, user_profile.email, duration,))

    print("\nTotal Duration: %s" % (total_duration,))
    print("\nTotal Duration in minutes: %s" % (total_duration.total_seconds() / 60.,))
    print("Total Duration amortized to a month: %s" % (total_duration.total_seconds() * 30. / 60.,))

class Command(BaseCommand):
    help = """Report analytics of user activity on a per-user and realm basis.

This command aggregates user activity data that is collected by each user using Zulip. It attempts
to approximate how much each user has been using Zulip per day, measured by recording each 15 minute
period where some activity has occurred (mouse move or keyboard activity).

It will correctly not count server-initiated reloads in the activity statistics.

The duration flag can be used to control how many days to show usage duration for.

Usage: ./manage.py analyze_user_activity [--realm=zulip] [--date=2013-09-10] [--duration=1]

By default, if no date is selected 2013-09-10 is used. If no realm is provided, information
is shown for all realms."""

    def add_arguments(self, parser: CommandParser) -> None:
        parser.add_argument('--realm', action='store')
        parser.add_argument('--date', action='store', default="2013-09-06")
        parser.add_argument('--duration', action='store', default=1, type=int,
                            help="How many days to show usage information for")

    def handle(self, *args: Any, **options: Any) -> None:
        analyze_activity(options)


@@ -1,21 +1,27 @@
-import os
-import time
+from argparse import ArgumentParser
 from datetime import timedelta
-from typing import Any, Dict

 from django.core.management.base import BaseCommand
 from django.utils.timezone import now as timezone_now

+from analytics.models import InstallationCount, installation_epoch, \
+    last_successful_fill
 from analytics.lib.counts import COUNT_STATS, CountStat
-from analytics.models import installation_epoch, last_successful_fill
-from zerver.lib.timestamp import TimezoneNotUTCException, floor_to_day, floor_to_hour, verify_UTC
+from zerver.lib.timestamp import floor_to_hour, floor_to_day, verify_UTC, \
+    TimezoneNotUTCException
 from zerver.models import Realm

+import os
+import subprocess
+import sys
+import time
+from typing import Any, Dict
+
 states = {
     0: "OK",
     1: "WARNING",
     2: "CRITICAL",
-    3: "UNKNOWN",
+    3: "UNKNOWN"
 }

 class Command(BaseCommand):
@@ -32,8 +38,9 @@ class Command(BaseCommand):
         state_file_tmp = state_file_path + "-tmp"
         with open(state_file_tmp, "w") as f:
-            f.write(f"{int(time.time())}|{status}|{states[status]}|{message}\n")
-        os.rename(state_file_tmp, state_file_path)
+            f.write("%s|%s|%s|%s\n" % (
+                int(time.time()), status, states[status], message))
+        subprocess.check_call(["mv", state_file_tmp, state_file_path])

     def get_fill_state(self) -> Dict[str, Any]:
         if not Realm.objects.exists():
@@ -48,7 +55,7 @@ class Command(BaseCommand):
             try:
                 verify_UTC(last_fill)
             except TimezoneNotUTCException:
-                return {'status': 2, 'message': f'FillState not in UTC for {property}'}
+                return {'status': 2, 'message': 'FillState not in UTC for %s' % (property,)}

             if stat.frequency == CountStat.DAY:
                 floor_function = floor_to_day
@@ -60,7 +67,8 @@ class Command(BaseCommand):
                 critical_threshold = timedelta(minutes=150)

             if floor_function(last_fill) != last_fill:
-                return {'status': 2, 'message': f'FillState not on {stat.frequency} boundary for {property}'}
+                return {'status': 2, 'message': 'FillState not on %s boundary for %s' %
+                        (stat.frequency, property)}

             time_to_last_fill = timezone_now() - last_fill
             if time_to_last_fill > critical_threshold:
@@ -71,16 +79,7 @@ class Command(BaseCommand):
         if len(critical_unfilled_properties) == 0 and len(warning_unfilled_properties) == 0:
             return {'status': 0, 'message': 'FillState looks fine.'}
         if len(critical_unfilled_properties) == 0:
-            return {
-                'status': 1,
-                'message': 'Missed filling {} once.'.format(
-                    ', '.join(warning_unfilled_properties),
-                ),
-            }
-        return {
-            'status': 2,
-            'message': 'Missed filling {} once. Missed filling {} at least twice.'.format(
-                ', '.join(warning_unfilled_properties),
-                ', '.join(critical_unfilled_properties),
-            ),
-        }
+            return {'status': 1, 'message': 'Missed filling %s once.' %
+                    (', '.join(warning_unfilled_properties),)}
+        return {'status': 2, 'message': 'Missed filling %s once. Missed filling %s at least twice.' %
+                (', '.join(warning_unfilled_properties), ', '.join(critical_unfilled_properties))}
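Both variants write the Nagios state file through a temp file that is then moved over the real one, so the monitoring check never reads a half-written line; os.rename is the atomic-on-one-filesystem version of the mv subprocess. The line format is timestamp|code|label|message. The pattern in isolation, as a hedged standalone sketch:

import os
import time

STATES = {0: "OK", 1: "WARNING", 2: "CRITICAL", 3: "UNKNOWN"}

def write_state_file(path: str, status: int, message: str) -> None:
    tmp = path + "-tmp"
    with open(tmp, "w") as f:
        f.write("%s|%s|%s|%s\n" % (int(time.time()), status, STATES[status], message))
    os.rename(tmp, path)  # atomic replace when tmp and path share a filesystem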


@@ -1,11 +1,11 @@
+import sys
 from argparse import ArgumentParser
 from typing import Any

-from django.core.management.base import BaseCommand, CommandError
+from django.core.management.base import BaseCommand

 from analytics.lib.counts import do_drop_all_analytics_tables

 class Command(BaseCommand):
     help = """Clear analytics tables."""
@@ -18,4 +18,5 @@ class Command(BaseCommand):
         if options['force']:
             do_drop_all_analytics_tables()
         else:
-            raise CommandError("Would delete all data from analytics tables (!); use --force to do so.")
+            print("Would delete all data from analytics tables (!); use --force to do so.")
+            sys.exit(1)


@@ -1,11 +1,11 @@
+import sys
 from argparse import ArgumentParser
 from typing import Any

-from django.core.management.base import BaseCommand, CommandError
+from django.core.management.base import BaseCommand

 from analytics.lib.counts import COUNT_STATS, do_drop_single_stat

 class Command(BaseCommand):
     help = """Clear analytics tables."""
@@ -20,8 +20,10 @@ class Command(BaseCommand):
     def handle(self, *args: Any, **options: Any) -> None:
         property = options['property']
         if property not in COUNT_STATS:
-            raise CommandError(f"Invalid property: {property}")
+            print("Invalid property: %s" % (property,))
+            sys.exit(1)
         if not options['force']:
-            raise CommandError("No action taken. Use --force.")
+            print("No action taken. Use --force.")
+            sys.exit(1)
         do_drop_single_stat(property)
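The change in both management commands is the same: raising CommandError lets Django print the message to stderr and exit non-zero on its own, replacing the manual print plus sys.exit(1). The resulting guard-clause shape, sketched:

from django.core.management.base import BaseCommand, CommandError

class Command(BaseCommand):
    help = "Example guard clause (illustrative only)."

    def handle(self, *args, **options):
        if not options.get('force'):
            # Django catches CommandError, writes it to stderr, and exits 1.
            raise CommandError("No action taken. Use --force.")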


@@ -0,0 +1,73 @@
import datetime
from argparse import ArgumentParser
from typing import Any

from django.db.models import Count, QuerySet
from django.utils.timezone import now as timezone_now

from zerver.lib.management import ZulipBaseCommand
from zerver.models import UserActivity

class Command(ZulipBaseCommand):
    help = """Report rough client activity globally, for a realm, or for a user

Usage examples:

./manage.py client_activity --target server
./manage.py client_activity --target realm --realm zulip
./manage.py client_activity --target user --user hamlet@zulip.com --realm zulip"""

    def add_arguments(self, parser: ArgumentParser) -> None:
        parser.add_argument('--target', dest='target', required=True, type=str,
                            help="'server' will calculate client activity of the entire server. "
                                 "'realm' will calculate client activity of realm. "
                                 "'user' will calculate client activity of the user.")
        parser.add_argument('--user', dest='user', type=str,
                            help="The email address of the user you want to calculate activity.")
        self.add_realm_args(parser)

    def compute_activity(self, user_activity_objects: QuerySet) -> None:
        # Report data from the past week.
        #
        # This is a rough report of client activity because we inconsistently
        # register activity from various clients; think of it as telling you
        # approximately how many people from a group have used a particular
        # client recently. For example, this might be useful to get a sense of
        # how popular different versions of a desktop client are.
        #
        # Importantly, this does NOT tell you anything about the relative
        # volumes of requests from clients.
        threshold = timezone_now() - datetime.timedelta(days=7)
        client_counts = user_activity_objects.filter(
            last_visit__gt=threshold).values("client__name").annotate(
            count=Count('client__name'))

        total = 0
        counts = []
        for client_type in client_counts:
            count = client_type["count"]
            client = client_type["client__name"]
            total += count
            counts.append((count, client))

        counts.sort()

        for count in counts:
            print("%25s %15d" % (count[1], count[0]))
        print("Total:", total)

    def handle(self, *args: Any, **options: str) -> None:
        realm = self.get_realm(options)
        if options["user"] is None:
            if options["target"] == "server" and realm is None:
                # Report global activity.
                self.compute_activity(UserActivity.objects.all())
            elif options["target"] == "realm" and realm is not None:
                self.compute_activity(UserActivity.objects.filter(user_profile__realm=realm))
            else:
                self.print_help("./manage.py", "client_activity")
        elif options["target"] == "user":
            user_profile = self.get_user(options["user"], realm)
            self.compute_activity(UserActivity.objects.filter(user_profile=user_profile))
        else:
            self.print_help("./manage.py", "client_activity")


@@ -1,26 +1,18 @@
-from datetime import timedelta
-from typing import Any, Dict, List, Mapping, Optional, Type
-from unittest import mock
+from datetime import datetime, timedelta
+from typing import Any, Dict, List, Mapping, Optional, Text, Type, Union
 from django.core.management.base import BaseCommand
 from django.utils.timezone import now as timezone_now
-from analytics.lib.counts import COUNT_STATS, CountStat, do_drop_all_analytics_tables
+from analytics.lib.counts import COUNT_STATS, \
+    CountStat, do_drop_all_analytics_tables
 from analytics.lib.fixtures import generate_time_series_data
 from analytics.lib.time_utils import time_range
-from analytics.models import (
-    BaseCount,
-    FillState,
-    InstallationCount,
-    RealmCount,
-    StreamCount,
-    UserCount,
-)
-from zerver.lib.actions import STREAM_ASSIGNMENT_COLORS, do_change_user_role
-from zerver.lib.create_user import create_user
+from analytics.models import BaseCount, FillState, RealmCount, UserCount, StreamCount
 from zerver.lib.timestamp import floor_to_day
-from zerver.models import Client, Realm, Recipient, Stream, Subscription, UserProfile
+from zerver.models import Realm, UserProfile, Stream, Message, Client, \
+    RealmAuditLog, Recipient
 
 class Command(BaseCommand):
     help = """Populates analytics tables with randomly generated data."""
@@ -28,6 +20,20 @@ class Command(BaseCommand):
     DAYS_OF_DATA = 100
     random_seed = 26
 
+    def create_user(self, email: Text,
+                    full_name: Text,
+                    is_staff: bool,
+                    date_joined: datetime,
+                    realm: Realm) -> UserProfile:
+        user = UserProfile.objects.create(
+            email=email, full_name=full_name, is_staff=is_staff,
+            realm=realm, short_name=full_name, pointer=-1, last_pointer_updater='none',
+            api_key='42', date_joined=date_joined)
+        RealmAuditLog.objects.create(
+            realm=realm, modified_user=user, event_type='user_created',
+            event_time=user.date_joined)
+        return user
+
     def generate_fixture_data(self, stat: CountStat, business_hours_base: float,
                               non_business_hours_base: float, growth: float,
                               autocorrelation: float, spikiness: float,
@@ -40,57 +46,24 @@ class Command(BaseCommand):
             frequency=stat.frequency, partial_sum=partial_sum, random_seed=self.random_seed)
 
     def handle(self, *args: Any, **options: Any) -> None:
-        # TODO: This should arguably only delete the objects
-        # associated with the "analytics" realm.
         do_drop_all_analytics_tables()
-        # I believe this also deletes any objects with this realm as a foreign key
+        # This also deletes any objects with this realm as a foreign key
         Realm.objects.filter(string_id='analytics').delete()
-        # Because we just deleted a bunch of objects in the database
-        # directly (rather than deleting individual objects in Django,
-        # in which case our post_save hooks would have flushed the
-        # individual objects from memcached for us), we need to flush
-        # memcached in order to ensure deleted objects aren't still
-        # present in the memcached cache.
-        from zerver.apps import flush_cache
-        flush_cache(None)
         installation_time = timezone_now() - timedelta(days=self.DAYS_OF_DATA)
         last_end_time = floor_to_day(timezone_now())
         realm = Realm.objects.create(
             string_id='analytics', name='Analytics', date_created=installation_time)
-        with mock.patch("zerver.lib.create_user.timezone_now", return_value=installation_time):
-            shylock = create_user(
-                'shylock@analytics.ds',
-                'Shylock',
-                realm,
-                full_name='Shylock',
-                role=UserProfile.ROLE_REALM_ADMINISTRATOR
-            )
-        do_change_user_role(shylock, UserProfile.ROLE_REALM_ADMINISTRATOR, acting_user=None)
+        shylock = self.create_user('shylock@analytics.ds', 'Shylock', True, installation_time, realm)
         stream = Stream.objects.create(
             name='all', realm=realm, date_created=installation_time)
-        recipient = Recipient.objects.create(type_id=stream.id, type=Recipient.STREAM)
-        stream.recipient = recipient
-        stream.save(update_fields=["recipient"])
-        # Subscribe shylock to the stream to avoid invariant failures.
-        # TODO: This should use subscribe_users_to_streams from populate_db.
-        subs = [
-            Subscription(recipient=recipient,
-                         user_profile=shylock,
-                         color=STREAM_ASSIGNMENT_COLORS[0]),
-        ]
-        Subscription.objects.bulk_create(subs)
+        Recipient.objects.create(type_id=stream.id, type=Recipient.STREAM)
 
         def insert_fixture_data(stat: CountStat,
                                 fixture_data: Mapping[Optional[str], List[int]],
                                 table: Type[BaseCount]) -> None:
             end_times = time_range(last_end_time, last_end_time, stat.frequency,
                                    len(list(fixture_data.values())[0]))
-            if table == InstallationCount:
-                id_args: Dict[str, Any] = {}
             if table == RealmCount:
                 id_args = {'realm': realm}
             if table == UserCount:
@@ -104,67 +77,21 @@ class Command(BaseCommand):
                            value=value, **id_args)
                 for end_time, value in zip(end_times, values) if value != 0])
 
-        stat = COUNT_STATS['1day_actives::day']
-        realm_data: Mapping[Optional[str], List[int]] = {
-            None: self.generate_fixture_data(stat, .08, .02, 3, .3, 6, partial_sum=True),
-        }
-        insert_fixture_data(stat, realm_data, RealmCount)
-        installation_data: Mapping[Optional[str], List[int]] = {
-            None: self.generate_fixture_data(stat, .8, .2, 4, .3, 6, partial_sum=True),
-        }
-        insert_fixture_data(stat, installation_data, InstallationCount)
-        FillState.objects.create(property=stat.property, end_time=last_end_time,
-                                 state=FillState.DONE)
-
-        stat = COUNT_STATS['7day_actives::day']
-        realm_data = {
-            None: self.generate_fixture_data(stat, .2, .07, 3, .3, 6, partial_sum=True),
-        }
-        insert_fixture_data(stat, realm_data, RealmCount)
-        installation_data = {
-            None: self.generate_fixture_data(stat, 2, .7, 4, .3, 6, partial_sum=True),
-        }
-        insert_fixture_data(stat, installation_data, InstallationCount)
-        FillState.objects.create(property=stat.property, end_time=last_end_time,
-                                 state=FillState.DONE)
-
         stat = COUNT_STATS['realm_active_humans::day']
         realm_data = {
-            None: self.generate_fixture_data(stat, .8, .08, 3, .5, 3, partial_sum=True),
-        }
+            None: self.generate_fixture_data(stat, .1, .03, 3, .5, 3, partial_sum=True),
+        }  # type: Mapping[Optional[str], List[int]]
         insert_fixture_data(stat, realm_data, RealmCount)
-        installation_data = {
-            None: self.generate_fixture_data(stat, 1, .3, 4, .5, 3, partial_sum=True),
-        }
-        insert_fixture_data(stat, installation_data, InstallationCount)
-        FillState.objects.create(property=stat.property, end_time=last_end_time,
-                                 state=FillState.DONE)
-
-        stat = COUNT_STATS['active_users_audit:is_bot:day']
-        realm_data = {
-            'false': self.generate_fixture_data(stat, 1, .2, 3.5, .8, 2, partial_sum=True),
-            'true': self.generate_fixture_data(stat, .3, .05, 3, .3, 2, partial_sum=True),
-        }
-        insert_fixture_data(stat, realm_data, RealmCount)
-        installation_data = {
-            'false': self.generate_fixture_data(stat, 3, 1, 4, .8, 2, partial_sum=True),
-            'true': self.generate_fixture_data(stat, 1, .4, 4, .8, 2, partial_sum=True),
-        }
-        insert_fixture_data(stat, installation_data, InstallationCount)
         FillState.objects.create(property=stat.property, end_time=last_end_time,
                                  state=FillState.DONE)
 
         stat = COUNT_STATS['messages_sent:is_bot:hour']
-        user_data: Mapping[Optional[str], List[int]] = {
-            'false': self.generate_fixture_data(stat, 2, 1, 1.5, .6, 8, holiday_rate=.1),
-        }
+        user_data = {'false': self.generate_fixture_data(
+            stat, 2, 1, 1.5, .6, 8, holiday_rate=.1)}  # type: Mapping[Optional[str], List[int]]
         insert_fixture_data(stat, user_data, UserCount)
         realm_data = {'false': self.generate_fixture_data(stat, 35, 15, 6, .6, 4),
                       'true': self.generate_fixture_data(stat, 15, 15, 3, .4, 2)}
         insert_fixture_data(stat, realm_data, RealmCount)
-        installation_data = {'false': self.generate_fixture_data(stat, 350, 150, 6, .6, 4),
-                             'true': self.generate_fixture_data(stat, 150, 150, 3, .4, 2)}
-        insert_fixture_data(stat, installation_data, InstallationCount)
         FillState.objects.create(property=stat.property, end_time=last_end_time,
                                  state=FillState.DONE)
@@ -180,12 +107,6 @@ class Command(BaseCommand):
                       'private_message': self.generate_fixture_data(stat, 13, 5, 5, .6, 4),
                       'huddle_message': self.generate_fixture_data(stat, 6, 3, 3, .6, 4)}
         insert_fixture_data(stat, realm_data, RealmCount)
-        installation_data = {
-            'public_stream': self.generate_fixture_data(stat, 300, 80, 5, .6, 4),
-            'private_stream': self.generate_fixture_data(stat, 70, 70, 5, .6, 4),
-            'private_message': self.generate_fixture_data(stat, 130, 50, 5, .6, 4),
-            'huddle_message': self.generate_fixture_data(stat, 60, 30, 3, .6, 4)}
-        insert_fixture_data(stat, installation_data, InstallationCount)
         FillState.objects.create(property=stat.property, end_time=last_end_time,
                                  state=FillState.DONE)
@@ -215,17 +136,6 @@ class Command(BaseCommand):
                       unused.id: self.generate_fixture_data(stat, 0, 0, 0, 0, 0),
                       long_webhook.id: self.generate_fixture_data(stat, 5, 5, 2, .6, 3)}
         insert_fixture_data(stat, realm_data, RealmCount)
-        installation_data = {
-            website.id: self.generate_fixture_data(stat, 300, 200, 5, .6, 3),
-            old_desktop.id: self.generate_fixture_data(stat, 50, 30, 8, .6, 3),
-            android.id: self.generate_fixture_data(stat, 50, 50, 2, .6, 3),
-            iOS.id: self.generate_fixture_data(stat, 50, 50, 2, .6, 3),
-            react_native.id: self.generate_fixture_data(stat, 5, 5, 10, .6, 3),
-            API.id: self.generate_fixture_data(stat, 50, 50, 5, .6, 3),
-            zephyr_mirror.id: self.generate_fixture_data(stat, 10, 10, 3, .6, 3),
-            unused.id: self.generate_fixture_data(stat, 0, 0, 0, 0, 0),
-            long_webhook.id: self.generate_fixture_data(stat, 50, 50, 2, .6, 3)}
-        insert_fixture_data(stat, installation_data, InstallationCount)
         FillState.objects.create(property=stat.property, end_time=last_end_time,
                                  state=FillState.DONE)
@@ -233,22 +143,8 @@ class Command(BaseCommand):
         realm_data = {'false': self.generate_fixture_data(stat, 30, 5, 6, .6, 4),
                       'true': self.generate_fixture_data(stat, 20, 2, 3, .2, 3)}
         insert_fixture_data(stat, realm_data, RealmCount)
-        stream_data: Mapping[Optional[str], List[int]] = {
-            'false': self.generate_fixture_data(stat, 10, 7, 5, .6, 4),
-            'true': self.generate_fixture_data(stat, 5, 3, 2, .4, 2),
-        }
+        stream_data = {'false': self.generate_fixture_data(stat, 10, 7, 5, .6, 4),
+                       'true': self.generate_fixture_data(stat, 5, 3, 2, .4, 2)}  # type: Mapping[Optional[str], List[int]]
         insert_fixture_data(stat, stream_data, StreamCount)
         FillState.objects.create(property=stat.property, end_time=last_end_time,
                                  state=FillState.DONE)
-
-        stat = COUNT_STATS['messages_read::hour']
-        user_data = {
-            None: self.generate_fixture_data(stat, 7, 3, 2, .6, 8, holiday_rate=.1),
-        }
-        insert_fixture_data(stat, user_data, UserCount)
-        realm_data = {
-            None: self.generate_fixture_data(stat, 50, 35, 6, .6, 4)
-        }
-        insert_fixture_data(stat, realm_data, RealmCount)
-        FillState.objects.create(property=stat.property, end_time=last_end_time,
-                                 state=FillState.DONE)

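The bare positional floats in the generate_fixture_data calls above are hard to scan; they line up with the parameter names in the signature shown in the same hunk. A small self-contained restatement of one call's arguments (the messages_sent:is_bot:hour UserCount fixture); the per-parameter glosses are my reading of the names, not documentation from the commit:

# Named restatement of: generate_fixture_data(stat, 2, 1, 1.5, .6, 8, holiday_rate=.1)
fixture_params = dict(
    business_hours_base=2,      # presumably the weekday business-hours baseline
    non_business_hours_base=1,  # presumably the nights-and-weekends baseline
    growth=1.5,                 # overall growth across the generated series
    autocorrelation=.6,         # smoothness between adjacent points
    spikiness=8,                # burstiness of individual samples
    holiday_rate=.1,            # fraction of days treated as holidays
)
print(fixture_params)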

@@ -0,0 +1,153 @@
import datetime
from argparse import ArgumentParser
from typing import Any, List

import pytz
from django.core.management.base import BaseCommand
from django.db.models import Count
from django.utils.timezone import now as timezone_now

from zerver.models import Message, Realm, Recipient, Stream, \
    Subscription, UserActivity, UserMessage, UserProfile, get_realm

MOBILE_CLIENT_LIST = ["Android", "ios"]
HUMAN_CLIENT_LIST = MOBILE_CLIENT_LIST + ["website"]

human_messages = Message.objects.filter(sending_client__name__in=HUMAN_CLIENT_LIST)


class Command(BaseCommand):
    help = "Generate statistics on realm activity."

    def add_arguments(self, parser: ArgumentParser) -> None:
        parser.add_argument('realms', metavar='<realm>', type=str, nargs='*',
                            help="realm to generate statistics for")

    def active_users(self, realm: Realm) -> List[UserProfile]:
        # Has been active (on the website, for now) in the last 7 days.
        activity_cutoff = timezone_now() - datetime.timedelta(days=7)
        return [activity.user_profile for activity in (
            UserActivity.objects.filter(user_profile__realm=realm,
                                        user_profile__is_active=True,
                                        last_visit__gt=activity_cutoff,
                                        query="/json/users/me/pointer",
                                        client__name="website"))]

    def messages_sent_by(self, user: UserProfile, days_ago: int) -> int:
        sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
        return human_messages.filter(sender=user, pub_date__gt=sent_time_cutoff).count()

    def total_messages(self, realm: Realm, days_ago: int) -> int:
        sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
        return Message.objects.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).count()

    def human_messages(self, realm: Realm, days_ago: int) -> int:
        sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
        return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).count()

    def api_messages(self, realm: Realm, days_ago: int) -> int:
        return (self.total_messages(realm, days_ago) - self.human_messages(realm, days_ago))

    def stream_messages(self, realm: Realm, days_ago: int) -> int:
        sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
        return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff,
                                     recipient__type=Recipient.STREAM).count()

    def private_messages(self, realm: Realm, days_ago: int) -> int:
        sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
        return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).exclude(
            recipient__type=Recipient.STREAM).exclude(recipient__type=Recipient.HUDDLE).count()

    def group_private_messages(self, realm: Realm, days_ago: int) -> int:
        sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
        return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).exclude(
            recipient__type=Recipient.STREAM).exclude(recipient__type=Recipient.PERSONAL).count()

    def report_percentage(self, numerator: float, denominator: float, text: str) -> None:
        if not denominator:
            fraction = 0.0
        else:
            fraction = numerator / float(denominator)
        print("%.2f%% of" % (fraction * 100,), text)

    def handle(self, *args: Any, **options: Any) -> None:
        if options['realms']:
            try:
                realms = [get_realm(string_id) for string_id in options['realms']]
            except Realm.DoesNotExist as e:
                print(e)
                exit(1)
        else:
            realms = Realm.objects.all()

        for realm in realms:
            print(realm.string_id)

            user_profiles = UserProfile.objects.filter(realm=realm, is_active=True)
            active_users = self.active_users(realm)
            num_active = len(active_users)

            print("%d active users (%d total)" % (num_active, len(user_profiles)))
            streams = Stream.objects.filter(realm=realm).extra(
                tables=['zerver_subscription', 'zerver_recipient'],
                where=['zerver_subscription.recipient_id = zerver_recipient.id',
                       'zerver_recipient.type = 2',
                       'zerver_recipient.type_id = zerver_stream.id',
                       'zerver_subscription.active = true']).annotate(count=Count("name"))
            print("%d streams" % (streams.count(),))

            for days_ago in (1, 7, 30):
                print("In last %d days, users sent:" % (days_ago,))
                sender_quantities = [self.messages_sent_by(user, days_ago) for user in user_profiles]
                for quantity in sorted(sender_quantities, reverse=True):
                    print(quantity, end=' ')
                print("")

                print("%d stream messages" % (self.stream_messages(realm, days_ago),))
                print("%d one-on-one private messages" % (self.private_messages(realm, days_ago),))
                print("%d messages sent via the API" % (self.api_messages(realm, days_ago),))
                print("%d group private messages" % (self.group_private_messages(realm, days_ago),))

            num_notifications_enabled = len([x for x in active_users if x.enable_desktop_notifications])
            self.report_percentage(num_notifications_enabled, num_active,
                                   "active users have desktop notifications enabled")
            num_enter_sends = len([x for x in active_users if x.enter_sends])
            self.report_percentage(num_enter_sends, num_active,
                                   "active users have enter-sends")

            all_message_count = human_messages.filter(sender__realm=realm).count()
            multi_paragraph_message_count = human_messages.filter(
                sender__realm=realm, content__contains="\n\n").count()
            self.report_percentage(multi_paragraph_message_count, all_message_count,
                                   "all messages are multi-paragraph")

            # Starred messages
            starrers = UserMessage.objects.filter(user_profile__in=user_profiles,
                                                  flags=UserMessage.flags.starred).values(
                "user_profile").annotate(count=Count("user_profile"))
            print("%d users have starred %d messages" % (
                len(starrers), sum([elt["count"] for elt in starrers])))

            active_user_subs = Subscription.objects.filter(
                user_profile__in=user_profiles, active=True)

            # Streams not in home view
            non_home_view = active_user_subs.filter(in_home_view=False).values(
                "user_profile").annotate(count=Count("user_profile"))
            print("%d users have %d streams not in home view" % (
                len(non_home_view), sum([elt["count"] for elt in non_home_view])))

            # Code block markup
            markup_messages = human_messages.filter(
                sender__realm=realm, content__contains="~~~").values(
                "sender").annotate(count=Count("sender"))
            print("%d users have used code block markup on %s messages" % (
                len(markup_messages), sum([elt["count"] for elt in markup_messages])))

            # Notifications for stream messages
            notifications = active_user_subs.filter(notifications=True).values(
                "user_profile").annotate(count=Count("user_profile"))
            print("%d users receive desktop notifications for %d streams" % (
                len(notifications), sum([elt["count"] for elt in notifications])))

            print("")

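The report_percentage helper above guards against a zero denominator before formatting. A standalone restatement with example calls (the sample numbers are invented):

def report_percentage(numerator: float, denominator: float, text: str) -> None:
    # Avoid ZeroDivisionError when a realm has no active users or messages.
    fraction = 0.0 if not denominator else numerator / float(denominator)
    print("%.2f%% of" % (fraction * 100,), text)

report_percentage(3, 12, "active users have enter-sends")  # 25.00% of ...
report_percentage(3, 0, "active users have enter-sends")   # 0.00% of ...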

@@ -1,11 +1,11 @@
 from argparse import ArgumentParser
 from typing import Any
-from django.core.management.base import BaseCommand, CommandError
+from django.core.management.base import BaseCommand
 from django.db.models import Q
-from zerver.models import Message, Realm, Recipient, Stream, Subscription, get_realm
+from zerver.models import Message, Realm, \
+    Recipient, Stream, Subscription, get_realm
 
 class Command(BaseCommand):
     help = "Generate statistics on the streams for a realm."
@@ -19,38 +19,26 @@ class Command(BaseCommand):
             try:
                 realms = [get_realm(string_id) for string_id in options['realms']]
             except Realm.DoesNotExist as e:
-                raise CommandError(e)
+                print(e)
+                exit(1)
         else:
             realms = Realm.objects.all()
 
         for realm in realms:
+            print(realm.string_id)
+            print("------------")
+            print("%25s %15s %10s" % ("stream", "subscribers", "messages"))
             streams = Stream.objects.filter(realm=realm).exclude(Q(name__istartswith="tutorial-"))
-            # private stream count
-            private_count = 0
-            # public stream count
-            public_count = 0
+            invite_only_count = 0
             for stream in streams:
                 if stream.invite_only:
-                    private_count += 1
-                else:
-                    public_count += 1
-            print("------------")
-            print(realm.string_id, end=' ')
-            print("{:>10} {} public streams and".format("(", public_count), end=' ')
-            print(f"{private_count} private streams )")
-            print("------------")
-            print("{:>25} {:>15} {:>10} {:>12}".format("stream", "subscribers", "messages", "type"))
-            for stream in streams:
-                if stream.invite_only:
-                    stream_type = 'private'
-                else:
-                    stream_type = 'public'
-                print(f"{stream.name:>25}", end=' ')
+                    invite_only_count += 1
+                    continue
+                print("%25s" % (stream.name,), end=' ')
                 recipient = Recipient.objects.filter(type=Recipient.STREAM, type_id=stream.id)
-                print("{:10}".format(len(Subscription.objects.filter(recipient=recipient,
-                                                                     active=True))), end=' ')
+                print("%10d" % (len(Subscription.objects.filter(recipient=recipient,
+                                                                active=True)),), end=' ')
                 num_messages = len(Message.objects.filter(recipient=recipient))
-                print(f"{num_messages:12}", end=' ')
-                print(f"{stream_type:>15}")
+                print("%12d" % (num_messages,))
+            print("%d invite-only streams" % (invite_only_count,))
             print("")


@@ -1,21 +1,19 @@
 import os
 import time
 from argparse import ArgumentParser
-from datetime import timezone
 from typing import Any, Dict
 
 from django.conf import settings
 from django.core.management.base import BaseCommand
 from django.utils.dateparse import parse_datetime
 from django.utils.timezone import now as timezone_now
+from django.utils.timezone import utc as timezone_utc
 
 from analytics.lib.counts import COUNT_STATS, logger, process_count_stat
 from scripts.lib.zulip_tools import ENDC, WARNING
-from zerver.lib.remote_server import send_analytics_to_remote_server
 from zerver.lib.timestamp import floor_to_hour
 from zerver.models import Realm
 
 class Command(BaseCommand):
     help = """Fills Analytics tables.
@@ -60,18 +58,18 @@ class Command(BaseCommand):
             fill_to_time = parse_datetime(options['time'])
             if options['utc']:
-                fill_to_time = fill_to_time.replace(tzinfo=timezone.utc)
+                fill_to_time = fill_to_time.replace(tzinfo=timezone_utc)
             if fill_to_time.tzinfo is None:
                 raise ValueError("--time must be timezone aware. Maybe you meant to use the --utc option?")
 
-            fill_to_time = floor_to_hour(fill_to_time.astimezone(timezone.utc))
+            fill_to_time = floor_to_hour(fill_to_time.astimezone(timezone_utc))
 
             if options['stat'] is not None:
                 stats = [COUNT_STATS[options['stat']]]
             else:
                 stats = list(COUNT_STATS.values())
 
-            logger.info("Starting updating analytics counts through %s", fill_to_time)
+            logger.info("Starting updating analytics counts through %s" % (fill_to_time,))
             if options['verbose']:
                 start = time.time()
                 last = start
@@ -79,12 +77,10 @@ class Command(BaseCommand):
             for stat in stats:
                 process_count_stat(stat, fill_to_time)
                 if options['verbose']:
-                    print(f"Updated {stat.property} in {time.time() - last:.3f}s")
+                    print("Updated %s in %.3fs" % (stat.property, time.time() - last))
                     last = time.time()
 
             if options['verbose']:
-                print(f"Finished updating analytics counts through {fill_to_time} in {time.time() - start:.3f}s")
-            logger.info("Finished updating analytics counts through %s", fill_to_time)
-
-            if settings.PUSH_NOTIFICATION_BOUNCER_URL and settings.SUBMIT_USAGE_STATISTICS:
-                send_analytics_to_remote_server()
+                print("Finished updating analytics counts through %s in %.3fs" %
+                      (fill_to_time, time.time() - start))
+            logger.info("Finished updating analytics counts through %s" % (fill_to_time,))

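The --utc branch above exists because parse_datetime returns a naive datetime when the input string carries no UTC offset, and the command then insists on a timezone-aware value. A minimal standalone check (assuming Django is installed; the timestamp is arbitrary):

from datetime import timezone

from django.utils.dateparse import parse_datetime  # pure parsing; no settings needed

fill_to_time = parse_datetime("2018-05-07 09:38:05")
assert fill_to_time is not None and fill_to_time.tzinfo is None  # naive
fill_to_time = fill_to_time.replace(tzinfo=timezone.utc)  # what --utc does
print(fill_to_time.isoformat())  # 2018-05-07T09:38:05+00:00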

@@ -0,0 +1,42 @@
import datetime
from argparse import ArgumentParser
from typing import Any

from django.core.management.base import BaseCommand
from django.utils.timezone import now as timezone_now

from zerver.models import Message, Realm, Stream, UserProfile, get_realm


class Command(BaseCommand):
    help = "Generate statistics on user activity."

    def add_arguments(self, parser: ArgumentParser) -> None:
        parser.add_argument('realms', metavar='<realm>', type=str, nargs='*',
                            help="realm to generate statistics for")

    def messages_sent_by(self, user: UserProfile, week: int) -> int:
        start = timezone_now() - datetime.timedelta(days=(week + 1)*7)
        end = timezone_now() - datetime.timedelta(days=week*7)
        return Message.objects.filter(sender=user, pub_date__gt=start, pub_date__lte=end).count()

    def handle(self, *args: Any, **options: Any) -> None:
        if options['realms']:
            try:
                realms = [get_realm(string_id) for string_id in options['realms']]
            except Realm.DoesNotExist as e:
                print(e)
                exit(1)
        else:
            realms = Realm.objects.all()

        for realm in realms:
            print(realm.string_id)
            user_profiles = UserProfile.objects.filter(realm=realm, is_active=True)
            print("%d users" % (len(user_profiles),))
            print("%d streams" % (len(Stream.objects.filter(realm=realm)),))

            for user_profile in user_profiles:
                print("%35s" % (user_profile.email,), end=' ')
                for week in range(10):
                    print("%5d" % (self.messages_sent_by(user_profile, week)), end=' ')
                print("")

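The messages_sent_by helper above slices history into trailing seven-day buckets. A quick standalone check of the boundary arithmetic (the fixed date is chosen arbitrarily):

import datetime

# week=0 covers the most recent 7 days, week=1 the 7 days before that, etc.
now = datetime.datetime(2018, 5, 7)
for week in range(3):
    start = now - datetime.timedelta(days=(week + 1) * 7)
    end = now - datetime.timedelta(days=week * 7)
    print(week, start.date(), "to", end.date())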

@@ -1,7 +1,9 @@
+# -*- coding: utf-8 -*-
 import django.db.models.deletion
 from django.conf import settings
 from django.db import migrations, models
+import zerver.lib.str_utils
 
 class Migration(migrations.Migration):
@@ -17,7 +19,7 @@ class Migration(migrations.Migration):
                 ('id', models.AutoField(verbose_name='ID', serialize=False, auto_created=True, primary_key=True)),
                 ('info', models.CharField(max_length=1000)),
             ],
-            bases=(models.Model,),
+            bases=(zerver.lib.str_utils.ModelReprMixin, models.Model),
         ),
         migrations.CreateModel(
             name='HuddleCount',
@@ -31,7 +33,7 @@ class Migration(migrations.Migration):
                 ('value', models.BigIntegerField()),
                 ('anomaly', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='analytics.Anomaly', null=True)),
             ],
-            bases=(models.Model,),
+            bases=(zerver.lib.str_utils.ModelReprMixin, models.Model),
         ),
         migrations.CreateModel(
             name='InstallationCount',
@@ -43,7 +45,7 @@ class Migration(migrations.Migration):
                 ('value', models.BigIntegerField()),
                 ('anomaly', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='analytics.Anomaly', null=True)),
             ],
-            bases=(models.Model,),
+            bases=(zerver.lib.str_utils.ModelReprMixin, models.Model),
         ),
         migrations.CreateModel(
             name='RealmCount',
@@ -57,7 +59,7 @@ class Migration(migrations.Migration):
                 ('anomaly', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='analytics.Anomaly', null=True)),
             ],
-            bases=(models.Model,),
+            bases=(zerver.lib.str_utils.ModelReprMixin, models.Model),
         ),
         migrations.CreateModel(
             name='StreamCount',
@@ -71,7 +73,7 @@ class Migration(migrations.Migration):
                 ('value', models.BigIntegerField()),
                 ('anomaly', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='analytics.Anomaly', null=True)),
             ],
-            bases=(models.Model,),
+            bases=(zerver.lib.str_utils.ModelReprMixin, models.Model),
         ),
         migrations.CreateModel(
             name='UserCount',
@@ -85,26 +87,26 @@ class Migration(migrations.Migration):
                 ('value', models.BigIntegerField()),
                 ('anomaly', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='analytics.Anomaly', null=True)),
             ],
-            bases=(models.Model,),
+            bases=(zerver.lib.str_utils.ModelReprMixin, models.Model),
         ),
         migrations.AlterUniqueTogether(
             name='usercount',
-            unique_together={('user', 'property', 'end_time', 'interval')},
+            unique_together=set([('user', 'property', 'end_time', 'interval')]),
         ),
         migrations.AlterUniqueTogether(
             name='streamcount',
-            unique_together={('stream', 'property', 'end_time', 'interval')},
+            unique_together=set([('stream', 'property', 'end_time', 'interval')]),
         ),
         migrations.AlterUniqueTogether(
             name='realmcount',
-            unique_together={('realm', 'property', 'end_time', 'interval')},
+            unique_together=set([('realm', 'property', 'end_time', 'interval')]),
         ),
         migrations.AlterUniqueTogether(
             name='installationcount',
-            unique_together={('property', 'end_time', 'interval')},
+            unique_together=set([('property', 'end_time', 'interval')]),
         ),
         migrations.AlterUniqueTogether(
             name='huddlecount',
-            unique_together={('huddle', 'property', 'end_time', 'interval')},
+            unique_together=set([('huddle', 'property', 'end_time', 'interval')]),
         ),
     ]


@@ -1,5 +1,5 @@
-from django.db import migrations
+# -*- coding: utf-8 -*-
+from django.db import migrations, models
 
 class Migration(migrations.Migration):
@@ -10,7 +10,7 @@ class Migration(migrations.Migration):
     operations = [
         migrations.AlterUniqueTogether(
             name='huddlecount',
-            unique_together=set(),
+            unique_together=set([]),
         ),
         migrations.RemoveField(
             model_name='huddlecount',


@@ -1,5 +1,7 @@
+# -*- coding: utf-8 -*-
 from django.db import migrations, models
+import zerver.lib.str_utils
 
 class Migration(migrations.Migration):
@@ -17,6 +19,6 @@ class Migration(migrations.Migration):
                 ('state', models.PositiveSmallIntegerField()),
                 ('last_modified', models.DateTimeField(auto_now=True)),
             ],
-            bases=(models.Model,),
+            bases=(zerver.lib.str_utils.ModelReprMixin, models.Model),
         ),
     ]


@@ -1,6 +1,6 @@
+# -*- coding: utf-8 -*-
 from django.db import migrations, models
 
 class Migration(migrations.Migration):
 
     dependencies = [


@@ -1,6 +1,6 @@
+# -*- coding: utf-8 -*-
 from django.db import migrations, models
 
 class Migration(migrations.Migration):
 
     dependencies = [


@@ -1,5 +1,5 @@
-from django.db import migrations
+# -*- coding: utf-8 -*-
+from django.db import migrations, models
 
 class Migration(migrations.Migration):
@@ -10,18 +10,18 @@ class Migration(migrations.Migration):
     operations = [
         migrations.AlterUniqueTogether(
             name='installationcount',
-            unique_together={('property', 'subgroup', 'end_time', 'interval')},
+            unique_together=set([('property', 'subgroup', 'end_time', 'interval')]),
         ),
         migrations.AlterUniqueTogether(
             name='realmcount',
-            unique_together={('realm', 'property', 'subgroup', 'end_time', 'interval')},
+            unique_together=set([('realm', 'property', 'subgroup', 'end_time', 'interval')]),
         ),
         migrations.AlterUniqueTogether(
             name='streamcount',
-            unique_together={('stream', 'property', 'subgroup', 'end_time', 'interval')},
+            unique_together=set([('stream', 'property', 'subgroup', 'end_time', 'interval')]),
        ),
         migrations.AlterUniqueTogether(
             name='usercount',
-            unique_together={('user', 'property', 'subgroup', 'end_time', 'interval')},
+            unique_together=set([('user', 'property', 'subgroup', 'end_time', 'interval')]),
         ),
     ]


@@ -1,7 +1,8 @@
+# -*- coding: utf-8 -*-
 # Generated by Django 1.10.4 on 2017-01-16 20:50
-from django.conf import settings
 from django.db import migrations
 
 class Migration(migrations.Migration):
 
     dependencies = [
@@ -11,7 +12,7 @@ class Migration(migrations.Migration):
     operations = [
         migrations.AlterUniqueTogether(
             name='installationcount',
-            unique_together={('property', 'subgroup', 'end_time')},
+            unique_together=set([('property', 'subgroup', 'end_time')]),
         ),
         migrations.RemoveField(
             model_name='installationcount',
@@ -19,7 +20,7 @@ class Migration(migrations.Migration):
         ),
         migrations.AlterUniqueTogether(
             name='realmcount',
-            unique_together={('realm', 'property', 'subgroup', 'end_time')},
+            unique_together=set([('realm', 'property', 'subgroup', 'end_time')]),
         ),
         migrations.RemoveField(
             model_name='realmcount',
@@ -27,7 +28,7 @@ class Migration(migrations.Migration):
         ),
         migrations.AlterUniqueTogether(
             name='streamcount',
-            unique_together={('stream', 'property', 'subgroup', 'end_time')},
+            unique_together=set([('stream', 'property', 'subgroup', 'end_time')]),
         ),
         migrations.RemoveField(
             model_name='streamcount',
@@ -35,7 +36,7 @@ class Migration(migrations.Migration):
         ),
         migrations.AlterUniqueTogether(
             name='usercount',
-            unique_together={('user', 'property', 'subgroup', 'end_time')},
+            unique_together=set([('user', 'property', 'subgroup', 'end_time')]),
         ),
         migrations.RemoveField(
             model_name='usercount',


@@ -1,7 +1,7 @@
+# -*- coding: utf-8 -*-
 # Generated by Django 1.10.5 on 2017-02-01 22:28
 from django.db import migrations
 
 class Migration(migrations.Migration):
 
     dependencies = [
@@ -12,14 +12,14 @@ class Migration(migrations.Migration):
     operations = [
         migrations.AlterIndexTogether(
             name='realmcount',
-            index_together={('property', 'end_time')},
+            index_together=set([('property', 'end_time')]),
         ),
         migrations.AlterIndexTogether(
             name='streamcount',
-            index_together={('property', 'realm', 'end_time')},
+            index_together=set([('property', 'realm', 'end_time')]),
         ),
         migrations.AlterIndexTogether(
             name='usercount',
-            index_together={('property', 'realm', 'end_time')},
+            index_together=set([('property', 'realm', 'end_time')]),
         ),
     ]


@@ -1,8 +1,8 @@
+# -*- coding: utf-8 -*-
 from django.db import migrations
-from django.db.backends.postgresql.schema import DatabaseSchemaEditor
+from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
 from django.db.migrations.state import StateApps
 
 def delete_messages_sent_to_stream_stat(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
     UserCount = apps.get_model('analytics', 'UserCount')
     StreamCount = apps.get_model('analytics', 'StreamCount')


@@ -1,8 +1,8 @@
+# -*- coding: utf-8 -*-
 from django.db import migrations
-from django.db.backends.postgresql.schema import DatabaseSchemaEditor
+from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
 from django.db.migrations.state import StateApps
 
 def clear_message_sent_by_message_type_values(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
     UserCount = apps.get_model('analytics', 'UserCount')
     StreamCount = apps.get_model('analytics', 'StreamCount')


@@ -1,8 +1,8 @@
+# -*- coding: utf-8 -*-
 from django.db import migrations
-from django.db.backends.postgresql.schema import DatabaseSchemaEditor
+from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
 from django.db.migrations.state import StateApps
 
 def clear_analytics_tables(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
     UserCount = apps.get_model('analytics', 'UserCount')
     StreamCount = apps.get_model('analytics', 'StreamCount')


@@ -1,7 +1,9 @@
+# -*- coding: utf-8 -*-
 # Generated by Django 1.11.6 on 2018-01-29 08:14
+from __future__ import unicode_literals
 
-import django.db.models.deletion
 from django.db import migrations, models
+import django.db.models.deletion
 
 class Migration(migrations.Migration):


@@ -1,32 +0,0 @@
# Generated by Django 1.11.18 on 2019-02-02 02:47

from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0012_add_on_delete'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='installationcount',
            name='anomaly',
        ),
        migrations.RemoveField(
            model_name='realmcount',
            name='anomaly',
        ),
        migrations.RemoveField(
            model_name='streamcount',
            name='anomaly',
        ),
        migrations.RemoveField(
            model_name='usercount',
            name='anomaly',
        ),
        migrations.DeleteModel(
            name='Anomaly',
        ),
    ]


@@ -1,17 +0,0 @@
# Generated by Django 1.11.26 on 2020-01-27 04:32

from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0013_remove_anomaly'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='fillstate',
            name='last_modified',
        ),
    ]


@@ -1,59 +0,0 @@
from django.db import migrations
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
from django.db.models import Count, Sum


def clear_duplicate_counts(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
    """This is a preparatory migration for our Analytics tables.

    The backstory is that Django's unique_together indexes do not properly
    handle the subgroup=None corner case (allowing duplicate rows that have a
    subgroup of None), which meant that in race conditions, rather than updating
    an existing row for the property/(realm, stream, user)/time with subgroup=None,
    Django would create a duplicate row.

    In the next migration, we'll add a proper constraint to fix this bug, but
    we need to fix any existing problematic rows before we can add that constraint.

    We fix this in an appropriate fashion for each type of CountStat object; mainly
    this means deleting the extra rows, but for LoggingCountStat objects, we need to
    additionally combine the sums.
    """
    count_tables = dict(realm=apps.get_model('analytics', 'RealmCount'),
                        user=apps.get_model('analytics', 'UserCount'),
                        stream=apps.get_model('analytics', 'StreamCount'),
                        installation=apps.get_model('analytics', 'InstallationCount'))

    for name, count_table in count_tables.items():
        value = [name, 'property', 'end_time']
        if name == 'installation':
            value = ['property', 'end_time']
        counts = count_table.objects.filter(subgroup=None).values(*value).annotate(
            Count('id'), Sum('value')).filter(id__count__gt=1)

        for count in counts:
            count.pop('id__count')
            total_value = count.pop('value__sum')
            duplicate_counts = list(count_table.objects.filter(**count))
            first_count = duplicate_counts[0]
            if count['property'] in ["invites_sent::day", "active_users_log:is_bot:day"]:
                # For LoggingCountStat objects, the right fix is to combine the totals;
                # for other CountStat objects, we expect the duplicates to have the same value.
                # And so all we need to do is delete them.
                first_count.value = total_value
                first_count.save()
            to_cleanup = duplicate_counts[1:]
            for duplicate_count in to_cleanup:
                duplicate_count.delete()


class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0014_remove_fillstate_last_modified'),
    ]

    operations = [
        migrations.RunPython(clear_duplicate_counts,
                             reverse_code=migrations.RunPython.noop),
    ]

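The NULL-subgroup corner case described in the docstring above is standard SQL behavior: a UNIQUE constraint treats NULLs as distinct, so two otherwise-identical rows whose subgroup is NULL both insert cleanly. A self-contained illustration using only the standard library (the table layout and values are invented; SQLite shares this NULL semantics with PostgreSQL):

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE realmcount (
        property TEXT, subgroup TEXT, end_time TEXT, value INTEGER,
        UNIQUE (property, subgroup, end_time)
    )
""")
row = ("invites_sent::day", None, "2020-01-01", 1)
conn.execute("INSERT INTO realmcount VALUES (?, ?, ?, ?)", row)
conn.execute("INSERT INTO realmcount VALUES (?, ?, ?, ?)", row)  # no IntegrityError
print(conn.execute("SELECT COUNT(*) FROM realmcount").fetchone()[0])  # prints 2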

@@ -1,61 +0,0 @@
# Generated by Django 2.2.10 on 2020-02-29 19:40

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('analytics', '0015_clear_duplicate_counts'),
    ]

    operations = [
        migrations.AlterUniqueTogether(
            name='installationcount',
            unique_together=set(),
        ),
        migrations.AlterUniqueTogether(
            name='realmcount',
            unique_together=set(),
        ),
        migrations.AlterUniqueTogether(
            name='streamcount',
            unique_together=set(),
        ),
        migrations.AlterUniqueTogether(
            name='usercount',
            unique_together=set(),
        ),
        migrations.AddConstraint(
            model_name='installationcount',
            constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=False), fields=('property', 'subgroup', 'end_time'), name='unique_installation_count'),
        ),
        migrations.AddConstraint(
            model_name='installationcount',
            constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=True), fields=('property', 'end_time'), name='unique_installation_count_null_subgroup'),
        ),
        migrations.AddConstraint(
            model_name='realmcount',
            constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=False), fields=('realm', 'property', 'subgroup', 'end_time'), name='unique_realm_count'),
        ),
        migrations.AddConstraint(
            model_name='realmcount',
            constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=True), fields=('realm', 'property', 'end_time'), name='unique_realm_count_null_subgroup'),
        ),
        migrations.AddConstraint(
            model_name='streamcount',
            constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=False), fields=('stream', 'property', 'subgroup', 'end_time'), name='unique_stream_count'),
        ),
        migrations.AddConstraint(
            model_name='streamcount',
            constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=True), fields=('stream', 'property', 'end_time'), name='unique_stream_count_null_subgroup'),
        ),
        migrations.AddConstraint(
            model_name='usercount',
            constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=False), fields=('user', 'property', 'subgroup', 'end_time'), name='unique_user_count'),
        ),
        migrations.AddConstraint(
            model_name='usercount',
            constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=True), fields=('user', 'property', 'end_time'), name='unique_user_count_null_subgroup'),
        ),
    ]


@@ -1,24 +1,24 @@
 import datetime
-from typing import Optional
+from typing import Any, Dict, Optional, Text, Tuple, Union
 
 from django.db import models
-from django.db.models import Q, UniqueConstraint
 
 from zerver.lib.timestamp import floor_to_day
-from zerver.models import Realm, Stream, UserProfile
+from zerver.models import Realm, Recipient, Stream, UserProfile
 
 class FillState(models.Model):
-    property: str = models.CharField(max_length=40, unique=True)
-    end_time: datetime.datetime = models.DateTimeField()
+    property = models.CharField(max_length=40, unique=True)  # type: Text
+    end_time = models.DateTimeField()  # type: datetime.datetime
 
     # Valid states are {DONE, STARTED}
     DONE = 1
     STARTED = 2
-    state: int = models.PositiveSmallIntegerField()
+    state = models.PositiveSmallIntegerField()  # type: int
+    last_modified = models.DateTimeField(auto_now=True)  # type: datetime.datetime
 
-    def __str__(self) -> str:
-        return f"<FillState: {self.property} {self.end_time} {self.state}>"
+    def __str__(self) -> Text:
+        return "<FillState: %s %s %s>" % (self.property, self.end_time, self.state)
 
 # The earliest/starting end_time in FillState
 # We assume there is at least one realm
@@ -34,14 +34,22 @@ def last_successful_fill(property: str) -> Optional[datetime.datetime]:
         return fillstate.end_time
     return fillstate.end_time - datetime.timedelta(hours=1)
 
+# would only ever make entries here by hand
+class Anomaly(models.Model):
+    info = models.CharField(max_length=1000)  # type: Text
+
+    def __str__(self) -> Text:
+        return "<Anomaly: %s... %s>" % (self.info, self.id)
+
 class BaseCount(models.Model):
     # Note: When inheriting from BaseCount, you may want to rearrange
     # the order of the columns in the migration to make sure they
     # match how you'd like the table to be arranged.
-    property: str = models.CharField(max_length=32)
-    subgroup: Optional[str] = models.CharField(max_length=16, null=True)
-    end_time: datetime.datetime = models.DateTimeField()
-    value: int = models.BigIntegerField()
+    property = models.CharField(max_length=32)  # type: Text
+    subgroup = models.CharField(max_length=16, null=True)  # type: Optional[Text]
+    end_time = models.DateTimeField()  # type: datetime.datetime
+    value = models.BigIntegerField()  # type: int
+    anomaly = models.ForeignKey(Anomaly, on_delete=models.SET_NULL, null=True)  # type: Optional[Anomaly]
 
     class Meta:
         abstract = True
@@ -49,83 +57,44 @@ class BaseCount(models.Model):
 class InstallationCount(BaseCount):
 
     class Meta:
-        # Handles invalid duplicate InstallationCount data
-        constraints = [
-            UniqueConstraint(
-                fields=["property", "subgroup", "end_time"],
-                condition=Q(subgroup__isnull=False),
-                name='unique_installation_count'),
-            UniqueConstraint(
-                fields=["property", "end_time"],
-                condition=Q(subgroup__isnull=True),
-                name='unique_installation_count_null_subgroup'),
-        ]
+        unique_together = ("property", "subgroup", "end_time")
 
-    def __str__(self) -> str:
-        return f"<InstallationCount: {self.property} {self.subgroup} {self.value}>"
+    def __str__(self) -> Text:
+        return "<InstallationCount: %s %s %s>" % (self.property, self.subgroup, self.value)
 
 class RealmCount(BaseCount):
     realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
 
     class Meta:
-        # Handles invalid duplicate RealmCount data
-        constraints = [
-            UniqueConstraint(
-                fields=["realm", "property", "subgroup", "end_time"],
-                condition=Q(subgroup__isnull=False),
-                name='unique_realm_count'),
-            UniqueConstraint(
-                fields=["realm", "property", "end_time"],
-                condition=Q(subgroup__isnull=True),
-                name='unique_realm_count_null_subgroup'),
-        ]
+        unique_together = ("realm", "property", "subgroup", "end_time")
         index_together = ["property", "end_time"]
 
-    def __str__(self) -> str:
-        return f"<RealmCount: {self.realm} {self.property} {self.subgroup} {self.value}>"
+    def __str__(self) -> Text:
+        return "<RealmCount: %s %s %s %s>" % (self.realm, self.property, self.subgroup, self.value)
 
 class UserCount(BaseCount):
     user = models.ForeignKey(UserProfile, on_delete=models.CASCADE)
     realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
 
     class Meta:
-        # Handles invalid duplicate UserCount data
-        constraints = [
-            UniqueConstraint(
-                fields=["user", "property", "subgroup", "end_time"],
-                condition=Q(subgroup__isnull=False),
-                name='unique_user_count'),
-            UniqueConstraint(
-                fields=["user", "property", "end_time"],
-                condition=Q(subgroup__isnull=True),
-                name='unique_user_count_null_subgroup'),
-        ]
+        unique_together = ("user", "property", "subgroup", "end_time")
         # This index dramatically improves the performance of
        # aggregating from users to realms
         index_together = ["property", "realm", "end_time"]
 
-    def __str__(self) -> str:
-        return f"<UserCount: {self.user} {self.property} {self.subgroup} {self.value}>"
+    def __str__(self) -> Text:
+        return "<UserCount: %s %s %s %s>" % (self.user, self.property, self.subgroup, self.value)
 
 class StreamCount(BaseCount):
     stream = models.ForeignKey(Stream, on_delete=models.CASCADE)
     realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
 
     class Meta:
-        # Handles invalid duplicate StreamCount data
-        constraints = [
-            UniqueConstraint(
-                fields=["stream", "property", "subgroup", "end_time"],
-                condition=Q(subgroup__isnull=False),
-                name='unique_stream_count'),
-            UniqueConstraint(
-                fields=["stream", "property", "end_time"],
-                condition=Q(subgroup__isnull=True),
-                name='unique_stream_count_null_subgroup'),
-        ]
+        unique_together = ("stream", "property", "subgroup", "end_time")
         # This index dramatically improves the performance of
         # aggregating from streams to realms
         index_together = ["property", "realm", "end_time"]
 
-    def __str__(self) -> str:
-        return f"<StreamCount: {self.stream} {self.property} {self.subgroup} {self.value} {self.id}>"
+    def __str__(self) -> Text:
+        return "<StreamCount: %s %s %s %s %s>" % (
+            self.stream, self.property, self.subgroup, self.value, self.id)


@@ -1,154 +1,98 @@
from datetime import datetime, timedelta, timezone
from typing import Any, Dict, List, Optional, Tuple, Type
from unittest import mock
import orjson from datetime import datetime, timedelta
from typing import Any, Dict, List, Optional, Text, Tuple, Type, Union
import ujson
from django.apps import apps from django.apps import apps
from django.db import models from django.db import models
from django.db.models import Sum from django.db.models import Sum
from django.test import TestCase
from django.utils.timezone import now as timezone_now from django.utils.timezone import now as timezone_now
from psycopg2.sql import SQL, Literal from django.utils.timezone import utc as timezone_utc
from analytics.lib.counts import ( from analytics.lib.counts import COUNT_STATS, CountStat, DataCollector, \
COUNT_STATS, DependentCountStat, LoggingCountStat, do_aggregate_to_summary_table, \
CountStat, do_drop_all_analytics_tables, do_drop_single_stat, \
DependentCountStat, do_fill_count_stat_at_hour, do_increment_logging_stat, \
LoggingCountStat, process_count_stat, sql_data_collector
do_aggregate_to_summary_table, from analytics.models import Anomaly, BaseCount, \
do_drop_all_analytics_tables, FillState, InstallationCount, RealmCount, StreamCount, \
do_drop_single_stat, UserCount, installation_epoch, last_successful_fill
do_fill_count_stat_at_hour, from zerver.lib.actions import do_activate_user, do_create_user, \
do_increment_logging_stat, do_deactivate_user, do_reactivate_user, update_user_activity_interval, \
get_count_stats, do_invite_users, do_revoke_user_invite, do_resend_user_invite_email, \
process_count_stat, InvitationError
sql_data_collector,
)
from analytics.models import (
BaseCount,
FillState,
InstallationCount,
RealmCount,
StreamCount,
UserCount,
installation_epoch,
)
from zerver.lib.actions import (
InvitationError,
do_activate_user,
do_create_user,
do_deactivate_user,
do_invite_users,
do_mark_all_as_read,
do_mark_stream_messages_as_read,
do_reactivate_user,
do_resend_user_invite_email,
do_revoke_user_invite,
do_update_message_flags,
update_user_activity_interval,
)
from zerver.lib.create_user import create_user
from zerver.lib.test_classes import ZulipTestCase
from zerver.lib.timestamp import TimezoneNotUTCException, floor_to_day from zerver.lib.timestamp import TimezoneNotUTCException, floor_to_day
from zerver.lib.topic import DB_TOPIC_NAME from zerver.models import Client, Huddle, Message, Realm, \
from zerver.models import ( RealmAuditLog, Recipient, Stream, UserActivityInterval, \
Client, UserProfile, get_client, get_user, PreregistrationUser
Huddle,
Message,
PreregistrationUser,
Realm,
RealmAuditLog,
Recipient,
Stream,
UserActivityInterval,
UserProfile,
get_client,
get_user,
)
class AnalyticsTestCase(TestCase):
class AnalyticsTestCase(ZulipTestCase):
MINUTE = timedelta(seconds = 60) MINUTE = timedelta(seconds = 60)
HOUR = MINUTE * 60 HOUR = MINUTE * 60
DAY = HOUR * 24 DAY = HOUR * 24
TIME_ZERO = datetime(1988, 3, 14, tzinfo=timezone.utc) TIME_ZERO = datetime(1988, 3, 14).replace(tzinfo=timezone_utc)
TIME_LAST_HOUR = TIME_ZERO - HOUR TIME_LAST_HOUR = TIME_ZERO - HOUR
def setUp(self) -> None: def setUp(self) -> None:
super().setUp()
self.default_realm = Realm.objects.create( self.default_realm = Realm.objects.create(
string_id='realmtest', name='Realm Test', date_created=self.TIME_ZERO - 2*self.DAY) string_id='realmtest', name='Realm Test', date_created=self.TIME_ZERO - 2*self.DAY)
# used to generate unique names in self.create_* # used to generate unique names in self.create_*
self.name_counter = 100 self.name_counter = 100
# used as defaults in self.assertCountEquals # used as defaults in self.assertCountEquals
self.current_property: Optional[str] = None self.current_property = None # type: Optional[str]
# Lightweight creation of users, streams, and messages # Lightweight creation of users, streams, and messages
def create_user(self, **kwargs: Any) -> UserProfile: def create_user(self, **kwargs: Any) -> UserProfile:
self.name_counter += 1 self.name_counter += 1
defaults = { defaults = {
'email': f'user{self.name_counter}@domain.tld', 'email': 'user%s@domain.tld' % (self.name_counter,),
'date_joined': self.TIME_LAST_HOUR, 'date_joined': self.TIME_LAST_HOUR,
'full_name': 'full_name', 'full_name': 'full_name',
'is_active': True, 'short_name': 'short_name',
'is_bot': False, 'pointer': -1,
'realm': self.default_realm} 'last_pointer_updater': 'seems unused?',
'realm': self.default_realm,
'api_key': '42'}
for key, value in defaults.items(): for key, value in defaults.items():
kwargs[key] = kwargs.get(key, value) kwargs[key] = kwargs.get(key, value)
kwargs['delivery_email'] = kwargs['email'] return UserProfile.objects.create(**kwargs)
with mock.patch("zerver.lib.create_user.timezone_now", return_value=kwargs['date_joined']):
pass_kwargs: Dict[str, Any] = {}
if kwargs['is_bot']:
pass_kwargs['bot_type'] = UserProfile.DEFAULT_BOT
pass_kwargs['bot_owner'] = None
return create_user(
kwargs['email'],
'password',
kwargs['realm'],
active=kwargs['is_active'],
full_name=kwargs['full_name'],
role=UserProfile.ROLE_REALM_ADMINISTRATOR,
**pass_kwargs
)
     def create_stream_with_recipient(self, **kwargs: Any) -> Tuple[Stream, Recipient]:
         self.name_counter += 1
-        defaults = {'name': f'stream name {self.name_counter}',
+        defaults = {'name': 'stream name %s' % (self.name_counter,),
                     'realm': self.default_realm,
                     'date_created': self.TIME_LAST_HOUR}
         for key, value in defaults.items():
             kwargs[key] = kwargs.get(key, value)
         stream = Stream.objects.create(**kwargs)
         recipient = Recipient.objects.create(type_id=stream.id, type=Recipient.STREAM)
-        stream.recipient = recipient
-        stream.save(update_fields=["recipient"])
         return stream, recipient

     def create_huddle_with_recipient(self, **kwargs: Any) -> Tuple[Huddle, Recipient]:
         self.name_counter += 1
-        defaults = {'huddle_hash': f'hash{self.name_counter}'}
+        defaults = {'huddle_hash': 'hash%s' % (self.name_counter,)}
         for key, value in defaults.items():
             kwargs[key] = kwargs.get(key, value)
         huddle = Huddle.objects.create(**kwargs)
         recipient = Recipient.objects.create(type_id=huddle.id, type=Recipient.HUDDLE)
-        huddle.recipient = recipient
-        huddle.save(update_fields=["recipient"])
         return huddle, recipient
     def create_message(self, sender: UserProfile, recipient: Recipient, **kwargs: Any) -> Message:
         defaults = {
             'sender': sender,
             'recipient': recipient,
-            DB_TOPIC_NAME: 'subject',
+            'subject': 'subject',
             'content': 'hi',
-            'date_sent': self.TIME_LAST_HOUR,
+            'pub_date': self.TIME_LAST_HOUR,
             'sending_client': get_client("website")}
         for key, value in defaults.items():
             kwargs[key] = kwargs.get(key, value)
         return Message.objects.create(**kwargs)

     # kwargs should only ever be a UserProfile or Stream.
-    def assertCountEquals(self, table: Type[BaseCount], value: int, property: Optional[str]=None,
-                          subgroup: Optional[str]=None, end_time: datetime=TIME_ZERO,
+    def assertCountEquals(self, table: Type[BaseCount], value: int, property: Optional[Text]=None,
+                          subgroup: Optional[Text]=None, end_time: datetime=TIME_ZERO,
                           realm: Optional[Realm]=None, **kwargs: models.Model) -> None:
         if property is None:
             property = self.current_property
@@ -190,7 +134,7 @@ class AnalyticsTestCase(ZulipTestCase):
             'end_time': self.TIME_ZERO,
             'value': 1}
         for values in arg_values:
-            kwargs: Dict[str, Any] = {}
+            kwargs = {}  # type: Dict[str, Any]
             for i in range(len(values)):
                 kwargs[arg_keys[i]] = values[i]
             for key, value in defaults.items():
@@ -208,13 +152,8 @@ class AnalyticsTestCase(ZulipTestCase):

 class TestProcessCountStat(AnalyticsTestCase):
     def make_dummy_count_stat(self, property: str) -> CountStat:
-        query = lambda kwargs: SQL("""
-            INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
-            VALUES ({default_realm_id}, 1, {property}, %(time_end)s)
-        """).format(
-            default_realm_id=Literal(self.default_realm.id),
-            property=Literal(property),
-        )
+        query = """INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
+                   VALUES (%s, 1, '%s', %%%%(time_end)s)""" % (self.default_realm.id, property)
         return CountStat(property, sql_data_collector(RealmCount, query, None), CountStat.HOUR)
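Both sides of make_dummy_count_stat build the same INSERT. The older side splices values into the query text with % formatting (hence the doubled %%%%(time_end)s, which has to survive two rounds of interpolation), while the newer side composes the statement with psycopg2's sql module and leaves %(time_end)s as a bind parameter. A rough sketch of the two styles, assuming psycopg2 is installed:

    from psycopg2.sql import SQL, Literal

    realm_id, prop = 42, 'stat1'

    # Older style: plain string interpolation into the SQL text.
    old_query = """INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
    VALUES (%s, 1, '%s', %%%%(time_end)s)""" % (realm_id, prop)

    # Newer style: composition objects. Literal() handles quoting/escaping, and
    # %(time_end)s remains a placeholder for cursor.execute(query, params).
    new_query = SQL("""
        INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
        VALUES ({realm_id}, 1, {property}, %(time_end)s)
    """).format(realm_id=Literal(realm_id), property=Literal(prop))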
     def assertFillStateEquals(self, stat: CountStat, end_time: datetime,
@@ -312,13 +251,8 @@ class TestProcessCountStat(AnalyticsTestCase):
     def test_process_dependent_stat(self) -> None:
         stat1 = self.make_dummy_count_stat('stat1')
         stat2 = self.make_dummy_count_stat('stat2')
-        query = lambda kwargs: SQL("""
-            INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
-            VALUES ({default_realm_id}, 1, {property}, %(time_end)s)
-        """).format(
-            default_realm_id=Literal(self.default_realm.id),
-            property=Literal('stat3'),
-        )
+        query = """INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
+                   VALUES (%s, 1, '%s', %%%%(time_end)s)""" % (self.default_realm.id, 'stat3')
         stat3 = DependentCountStat('stat3', sql_data_collector(RealmCount, query, None),
                                    CountStat.HOUR,
                                    dependencies=['stat1', 'stat2'])
@@ -351,13 +285,8 @@ class TestProcessCountStat(AnalyticsTestCase):
         self.assertFillStateEquals(stat3, hour[2])

         # test daily dependent stat with hourly dependencies
-        query = lambda kwargs: SQL("""
-            INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
-            VALUES ({default_realm_id}, 1, {property}, %(time_end)s)
-        """).format(
-            default_realm_id=Literal(self.default_realm.id),
-            property=Literal('stat4'),
-        )
+        query = """INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
+                   VALUES (%s, 1, '%s', %%%%(time_end)s)""" % (self.default_realm.id, 'stat4')
         stat4 = DependentCountStat('stat4', sql_data_collector(RealmCount, query, None),
                                    CountStat.DAY,
                                    dependencies=['stat1', 'stat2'])
@@ -380,12 +309,12 @@ class TestCountStats(AnalyticsTestCase):
             date_created=self.TIME_ZERO-2*self.DAY)
         for minutes_ago in [0, 1, 61, 60*24+1]:
             creation_time = self.TIME_ZERO - minutes_ago*self.MINUTE
-            user = self.create_user(email=f'user-{minutes_ago}@second.analytics',
+            user = self.create_user(email='user-%s@second.analytics' % (minutes_ago,),
                                     realm=self.second_realm, date_joined=creation_time)
             recipient = self.create_stream_with_recipient(
-                name=f'stream {minutes_ago}', realm=self.second_realm,
+                name='stream %s' % (minutes_ago,), realm=self.second_realm,
                 date_created=creation_time)[1]
-            self.create_message(user, recipient, date_sent=creation_time)
+            self.create_message(user, recipient, pub_date=creation_time)
         self.hourly_user = get_user('user-1@second.analytics', self.second_realm)
         self.daily_user = get_user('user-61@second.analytics', self.second_realm)

@@ -423,29 +352,6 @@ class TestCountStats(AnalyticsTestCase):
         self.assertTableState(UserCount, [], [])
         self.assertTableState(StreamCount, [], [])
-    def test_active_users_by_is_bot_for_realm_constraint(self) -> None:
-        # For single Realm
-        COUNT_STATS = get_count_stats(self.default_realm)
-        stat = COUNT_STATS['active_users:is_bot:day']
-        self.current_property = stat.property
-        # To be included
-        self.create_user(is_bot=True, date_joined=self.TIME_ZERO-25*self.HOUR)
-        self.create_user(is_bot=False)
-        # To be excluded
-        self.create_user(email='test@second.analytics',
-                         realm=self.second_realm, date_joined=self.TIME_ZERO-2*self.DAY)
-        do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
-        self.assertTableState(RealmCount, ['value', 'subgroup'],
-                              [[1, 'true'], [1, 'false']])
-        # No aggregation to InstallationCount with realm constraint
-        self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
-        self.assertTableState(UserCount, [], [])
-        self.assertTableState(StreamCount, [], [])
     def test_messages_sent_by_is_bot(self) -> None:
         stat = COUNT_STATS['messages_sent:is_bot:hour']
         self.current_property = stat.property
@@ -453,8 +359,8 @@ class TestCountStats(AnalyticsTestCase):
         bot = self.create_user(is_bot=True)
         human1 = self.create_user()
         human2 = self.create_user()
-        recipient_human1 = Recipient.objects.get(type_id=human1.id,
-                                                 type=Recipient.PERSONAL)
+        recipient_human1 = Recipient.objects.create(type_id=human1.id,
+                                                    type=Recipient.PERSONAL)
         recipient_stream = self.create_stream_with_recipient()[1]
         recipient_huddle = self.create_huddle_with_recipient()[1]
@@ -475,46 +381,6 @@ class TestCountStats(AnalyticsTestCase):
         self.assertTableState(InstallationCount, ['value', 'subgroup'], [[3, 'false'], [3, 'true']])
         self.assertTableState(StreamCount, [], [])
-    def test_messages_sent_by_is_bot_realm_constraint(self) -> None:
-        # For single Realm
-        COUNT_STATS = get_count_stats(self.default_realm)
-        stat = COUNT_STATS['messages_sent:is_bot:hour']
-        self.current_property = stat.property
-        bot = self.create_user(is_bot=True)
-        human1 = self.create_user()
-        human2 = self.create_user()
-        recipient_human1 = Recipient.objects.get(type_id=human1.id,
-                                                 type=Recipient.PERSONAL)
-        recipient_stream = self.create_stream_with_recipient()[1]
-        recipient_huddle = self.create_huddle_with_recipient()[1]
-        # To be included
-        self.create_message(bot, recipient_human1)
-        self.create_message(bot, recipient_stream)
-        self.create_message(bot, recipient_huddle)
-        self.create_message(human1, recipient_human1)
-        self.create_message(human2, recipient_human1)
-        # To be excluded
-        self.create_message(self.hourly_user, recipient_human1)
-        self.create_message(self.hourly_user, recipient_stream)
-        self.create_message(self.hourly_user, recipient_huddle)
-        do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
-        self.assertTableState(UserCount, ['value', 'subgroup', 'user'],
-                              [[1, 'false', human1], [1, 'false', human2],
-                               [3, 'true', bot]])
-        self.assertTableState(RealmCount, ['value', 'subgroup', 'realm'],
-                              [[2, 'false', self.default_realm],
-                               [3, 'true', self.default_realm]])
-        # No aggregation to InstallationCount with realm constraint
-        self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
-        self.assertTableState(StreamCount, [], [])
     def test_messages_sent_by_message_type(self) -> None:
         stat = COUNT_STATS['messages_sent:message_type:day']
         self.current_property = stat.property
@@ -548,9 +414,9 @@ class TestCountStats(AnalyticsTestCase):
         self.create_message(user2, recipient_huddle2)

         # private messages
-        recipient_user1 = Recipient.objects.get(type_id=user1.id, type=Recipient.PERSONAL)
-        recipient_user2 = Recipient.objects.get(type_id=user2.id, type=Recipient.PERSONAL)
-        recipient_user3 = Recipient.objects.get(type_id=user3.id, type=Recipient.PERSONAL)
+        recipient_user1 = Recipient.objects.create(type_id=user1.id, type=Recipient.PERSONAL)
+        recipient_user2 = Recipient.objects.create(type_id=user2.id, type=Recipient.PERSONAL)
+        recipient_user3 = Recipient.objects.create(type_id=user3.id, type=Recipient.PERSONAL)
         self.create_message(user1, recipient_user2)
         self.create_message(user2, recipient_user1)
         self.create_message(user3, recipient_user3)
@@ -577,49 +443,12 @@ class TestCountStats(AnalyticsTestCase):
                               [2, 'huddle_message']])
         self.assertTableState(StreamCount, [], [])
-    def test_messages_sent_by_message_type_realm_constraint(self) -> None:
-        # For single Realm
-        COUNT_STATS = get_count_stats(self.default_realm)
-        stat = COUNT_STATS['messages_sent:message_type:day']
-        self.current_property = stat.property
-        user = self.create_user()
-        user_recipient = Recipient.objects.get(type_id=user.id, type=Recipient.PERSONAL)
-        private_stream_recipient = self.create_stream_with_recipient(invite_only=True)[1]
-        stream_recipient = self.create_stream_with_recipient()[1]
-        huddle_recipient = self.create_huddle_with_recipient()[1]
-        # To be included
-        self.create_message(user, user_recipient)
-        self.create_message(user, private_stream_recipient)
-        self.create_message(user, stream_recipient)
-        self.create_message(user, huddle_recipient)
-        do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
-        # To be excluded
-        self.create_message(self.hourly_user, user_recipient)
-        self.create_message(self.hourly_user, private_stream_recipient)
-        self.create_message(self.hourly_user, stream_recipient)
-        self.create_message(self.hourly_user, huddle_recipient)
-        self.assertTableState(UserCount, ['value', 'subgroup', 'user'],
-                              [[1, 'private_message', user], [1, 'private_stream', user],
-                               [1, 'huddle_message', user], [1, 'public_stream', user]])
-        self.assertTableState(RealmCount, ['value', 'subgroup'],
-                              [[1, 'private_message'], [1, 'private_stream'],
-                               [1, 'public_stream'], [1, 'huddle_message']])
-        # No aggregation to InstallationCount with realm constraint
-        self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
-        self.assertTableState(StreamCount, [], [])
     def test_messages_sent_to_recipients_with_same_id(self) -> None:
         stat = COUNT_STATS['messages_sent:message_type:day']
         self.current_property = stat.property

         user = self.create_user(id=1000)
-        user_recipient = Recipient.objects.get(type_id=user.id, type=Recipient.PERSONAL)
+        user_recipient = Recipient.objects.create(type_id=user.id, type=Recipient.PERSONAL)
         stream_recipient = self.create_stream_with_recipient(id=1000)[1]
         huddle_recipient = self.create_huddle_with_recipient(id=1000)[1]
@@ -639,7 +468,7 @@ class TestCountStats(AnalyticsTestCase):
         user1 = self.create_user(is_bot=True)
         user2 = self.create_user()
-        recipient_user2 = Recipient.objects.get(type_id=user2.id, type=Recipient.PERSONAL)
+        recipient_user2 = Recipient.objects.create(type_id=user2.id, type=Recipient.PERSONAL)
         recipient_stream = self.create_stream_with_recipient()[1]
         recipient_huddle = self.create_huddle_with_recipient()[1]
@@ -668,42 +497,6 @@ class TestCountStats(AnalyticsTestCase):
                               [[4, website_client_id], [3, client2_id]])
         self.assertTableState(StreamCount, [], [])
-    def test_messages_sent_by_client_realm_constraint(self) -> None:
-        # For single Realm
-        COUNT_STATS = get_count_stats(self.default_realm)
-        stat = COUNT_STATS['messages_sent:client:day']
-        self.current_property = stat.property
-        user1 = self.create_user(is_bot=True)
-        user2 = self.create_user()
-        recipient_user2 = Recipient.objects.get(type_id=user2.id, type=Recipient.PERSONAL)
-        client2 = Client.objects.create(name='client2')
-        # To be included
-        self.create_message(user1, recipient_user2, sending_client=client2)
-        self.create_message(user2, recipient_user2, sending_client=client2)
-        self.create_message(user2, recipient_user2)
-        # To be excluded
-        self.create_message(self.hourly_user, recipient_user2, sending_client=client2)
-        self.create_message(self.hourly_user, recipient_user2, sending_client=client2)
-        self.create_message(self.hourly_user, recipient_user2)
-        do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
-        client2_id = str(client2.id)
-        website_client_id = str(get_client('website').id)  # default for self.create_message
-        self.assertTableState(UserCount, ['value', 'subgroup', 'user'],
-                              [[1, client2_id, user1], [1, client2_id, user2],
-                               [1, website_client_id, user2]])
-        self.assertTableState(RealmCount, ['value', 'subgroup'],
-                              [[1, website_client_id], [2, client2_id]])
-        # No aggregation to InstallationCount with realm constraint
-        self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
-        self.assertTableState(StreamCount, [], [])
     def test_messages_sent_to_stream_by_is_bot(self) -> None:
         stat = COUNT_STATS['messages_in_stream:is_bot:day']
         self.current_property = stat.property
@@ -711,7 +504,7 @@ class TestCountStats(AnalyticsTestCase):
         bot = self.create_user(is_bot=True)
         human1 = self.create_user()
         human2 = self.create_user()
-        recipient_human1 = Recipient.objects.get(type_id=human1.id, type=Recipient.PERSONAL)
+        recipient_human1 = Recipient.objects.create(type_id=human1.id, type=Recipient.PERSONAL)

         stream1, recipient_stream1 = self.create_stream_with_recipient()
         stream2, recipient_stream2 = self.create_stream_with_recipient()
@@ -741,116 +534,12 @@ class TestCountStats(AnalyticsTestCase):
         self.assertTableState(InstallationCount, ['value', 'subgroup'], [[5, 'false'], [2, 'true']])
         self.assertTableState(UserCount, [], [])
-    def test_messages_sent_to_stream_by_is_bot_realm_constraint(self) -> None:
-        # For single Realm
-        COUNT_STATS = get_count_stats(self.default_realm)
-        stat = COUNT_STATS['messages_in_stream:is_bot:day']
-        self.current_property = stat.property
-        human1 = self.create_user()
-        bot = self.create_user(is_bot=True)
-        realm = {'realm': self.second_realm}
-        stream1, recipient_stream1 = self.create_stream_with_recipient()
-        stream2, recipient_stream2 = self.create_stream_with_recipient(**realm)
-        # To be included
-        self.create_message(human1, recipient_stream1)
-        self.create_message(bot, recipient_stream1)
-        # To be excluded
-        self.create_message(self.hourly_user, recipient_stream2)
-        self.create_message(self.daily_user, recipient_stream2)
-        do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
-        self.assertTableState(StreamCount, ['value', 'subgroup', 'stream'],
-                              [[1, 'false', stream1],
-                               [1, 'true', stream1]])
-        self.assertTableState(RealmCount, ['value', 'subgroup', 'realm'],
-                              [[1, 'false'], [1, 'true']])
-        # No aggregation to InstallationCount with realm constraint
-        self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
-        self.assertTableState(UserCount, [], [])
     def create_interval(self, user: UserProfile, start_offset: timedelta,
                         end_offset: timedelta) -> None:
         UserActivityInterval.objects.create(
             user_profile=user, start=self.TIME_ZERO-start_offset,
             end=self.TIME_ZERO-end_offset)
-    def test_1day_actives(self) -> None:
-        stat = COUNT_STATS['1day_actives::day']
-        self.current_property = stat.property
-        _1day = 1*self.DAY - UserActivityInterval.MIN_INTERVAL_LENGTH
-        # Outside time range, should not appear. Also tests upper boundary.
-        user1 = self.create_user()
-        self.create_interval(user1, _1day + self.DAY, _1day + timedelta(seconds=1))
-        self.create_interval(user1, timedelta(0), -self.HOUR)
-        # On lower boundary, should appear
-        user2 = self.create_user()
-        self.create_interval(user2, _1day + self.DAY, _1day)
-        # Multiple intervals, including one outside boundary
-        user3 = self.create_user()
-        self.create_interval(user3, 2*self.DAY, 1*self.DAY)
-        self.create_interval(user3, 20*self.HOUR, 19*self.HOUR)
-        self.create_interval(user3, 20*self.MINUTE, 19*self.MINUTE)
-        # Intervals crossing boundary
-        user4 = self.create_user()
-        self.create_interval(user4, 1.5*self.DAY, 0.5*self.DAY)
-        user5 = self.create_user()
-        self.create_interval(user5, self.MINUTE, -self.MINUTE)
-        # Interval subsuming time range
-        user6 = self.create_user()
-        self.create_interval(user6, 2*self.DAY, -2*self.DAY)
-        # Second realm
-        user7 = self.create_user(realm=self.second_realm)
-        self.create_interval(user7, 20*self.MINUTE, 19*self.MINUTE)
-        do_fill_count_stat_at_hour(stat, self.TIME_ZERO)
-        self.assertTableState(UserCount, ['value', 'user'],
-                              [[1, user2], [1, user3], [1, user4], [1, user5], [1, user6], [1, user7]])
-        self.assertTableState(RealmCount, ['value', 'realm'],
-                              [[5, self.default_realm], [1, self.second_realm]])
-        self.assertTableState(InstallationCount, ['value'], [[6]])
-        self.assertTableState(StreamCount, [], [])
-
-    def test_1day_actives_realm_constraint(self) -> None:
-        # For single Realm
-        COUNT_STATS = get_count_stats(self.default_realm)
-        stat = COUNT_STATS['1day_actives::day']
-        self.current_property = stat.property
-        _1day = 1*self.DAY - UserActivityInterval.MIN_INTERVAL_LENGTH
-        user1 = self.create_user()
-        user2 = self.create_user()
-        # To be included
-        self.create_interval(user1, 20*self.HOUR, 19*self.HOUR)
-        self.create_interval(user2, _1day + self.DAY, _1day)
-        # To be excluded
-        user3 = self.create_user(realm=self.second_realm)
-        self.create_interval(user3, 20*self.MINUTE, 19*self.MINUTE)
-        do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
-        self.assertTableState(UserCount, ['value', 'user'],
-                              [[1, user1], [1, user2]])
-        self.assertTableState(RealmCount, ['value', 'realm'],
-                              [[2, self.default_realm]])
-        # No aggregation to InstallationCount with realm constraint
-        self.assertTableState(InstallationCount, ['value'], [])
-        self.assertTableState(StreamCount, [], [])
     def test_15day_actives(self) -> None:
         stat = COUNT_STATS['15day_actives::day']
         self.current_property = stat.property
@@ -894,36 +583,6 @@ class TestCountStats(AnalyticsTestCase):
         self.assertTableState(InstallationCount, ['value'], [[6]])
         self.assertTableState(StreamCount, [], [])
-    def test_15day_actives_realm_constraint(self) -> None:
-        # For single Realm
-        COUNT_STATS = get_count_stats(self.default_realm)
-        stat = COUNT_STATS['15day_actives::day']
-        self.current_property = stat.property
-        _15day = 15*self.DAY - UserActivityInterval.MIN_INTERVAL_LENGTH
-        user1 = self.create_user()
-        user2 = self.create_user()
-        user3 = self.create_user(realm=self.second_realm)
-        # To be included
-        self.create_interval(user1, _15day + self.DAY, _15day)
-        self.create_interval(user2, 20*self.HOUR, 19*self.HOUR)
-        # To be excluded
-        self.create_interval(user3, 20*self.HOUR, 19*self.HOUR)
-        do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
-        self.assertTableState(UserCount, ['value', 'user'],
-                              [[1, user1], [1, user2]])
-        self.assertTableState(RealmCount, ['value', 'realm'],
-                              [[2, self.default_realm]])
-        # No aggregation to InstallationCount with realm constraint
-        self.assertTableState(InstallationCount, ['value'], [])
-        self.assertTableState(StreamCount, [], [])
     def test_minutes_active(self) -> None:
         stat = COUNT_STATS['minutes_active::day']
         self.current_property = stat.property
@@ -966,35 +625,6 @@ class TestCountStats(AnalyticsTestCase):
         self.assertTableState(InstallationCount, ['value'], [[61 + 121 + 24*60 + 1]])
         self.assertTableState(StreamCount, [], [])
-    def test_minutes_active_realm_constraint(self) -> None:
-        # For single Realm
-        COUNT_STATS = get_count_stats(self.default_realm)
-        stat = COUNT_STATS['minutes_active::day']
-        self.current_property = stat.property
-        # Outside time range, should not appear. Also testing for intervals
-        # starting and ending on boundary
-        user1 = self.create_user()
-        user2 = self.create_user()
-        user3 = self.create_user(realm=self.second_realm)
-        # To be included
-        self.create_interval(user1, 20*self.HOUR, 19*self.HOUR)
-        self.create_interval(user2, 20*self.MINUTE, 19*self.MINUTE)
-        # To be excluded
-        self.create_interval(user3, 20*self.MINUTE, 19*self.MINUTE)
-        do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
-        self.assertTableState(UserCount, ['value', 'user'],
-                              [[60, user1], [1, user2]])
-        self.assertTableState(RealmCount, ['value', 'realm'],
-                              [[60 + 1, self.default_realm]])
-        # No aggregation to InstallationCount with realm constraint
-        self.assertTableState(InstallationCount, ['value'], [])
-        self.assertTableState(StreamCount, [], [])
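The expected values in the minutes_active assertions (60 for user1, 1 for user2, 60 + 1 for the realm) are just each interval's overlap, in minutes, with the 24-hour window ending at TIME_ZERO. A quick check of that arithmetic, independent of Zulip's actual query:

    from datetime import datetime, timedelta, timezone

    def overlap_minutes(start: datetime, end: datetime,
                        w_start: datetime, w_end: datetime) -> int:
        # Length of the intersection of [start, end] with [w_start, w_end].
        latest_start = max(start, w_start)
        earliest_end = min(end, w_end)
        return max(int((earliest_end - latest_start).total_seconds() // 60), 0)

    TIME_ZERO = datetime(1988, 3, 14, tzinfo=timezone.utc)
    HOUR, MINUTE = timedelta(hours=1), timedelta(minutes=1)
    window = (TIME_ZERO - 24 * HOUR, TIME_ZERO)

    assert overlap_minutes(TIME_ZERO - 20 * HOUR, TIME_ZERO - 19 * HOUR, *window) == 60
    assert overlap_minutes(TIME_ZERO - 20 * MINUTE, TIME_ZERO - 19 * MINUTE, *window) == 1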
 class TestDoAggregateToSummaryTable(AnalyticsTestCase):
     # do_aggregate_to_summary_table is mostly tested by the end to end
     # nature of the tests in TestCountStats. But want to highlight one
@@ -1061,12 +691,12 @@ class TestDoIncrementLoggingStat(AnalyticsTestCase):
         self.current_property = 'test'
         self.assertTableState(RealmCount, ['value', 'subgroup', 'end_time'],
                               [[1, 'subgroup1', self.TIME_ZERO], [1, 'subgroup2', self.TIME_ZERO],
                                [1, 'subgroup1', self.TIME_LAST_HOUR]])
         # This should trigger the get part of get_or_create
         do_increment_logging_stat(self.default_realm, stat, 'subgroup1', self.TIME_ZERO)
         self.assertTableState(RealmCount, ['value', 'subgroup', 'end_time'],
                               [[2, 'subgroup1', self.TIME_ZERO], [1, 'subgroup2', self.TIME_ZERO],
                                [1, 'subgroup1', self.TIME_LAST_HOUR]])

     def test_increment(self) -> None:
         stat = LoggingCountStat('test', RealmCount, CountStat.DAY)
@@ -1103,7 +733,7 @@ class TestLoggingCountStats(AnalyticsTestCase):
     def test_active_users_log_by_is_bot(self) -> None:
         property = 'active_users_log:is_bot:day'
-        user = do_create_user('email', 'password', self.default_realm, 'full_name')
+        user = do_create_user('email', 'password', self.default_realm, 'full_name', 'short_name')
         self.assertEqual(1, RealmCount.objects.filter(property=property, subgroup=False)
                          .aggregate(Sum('value'))['value__sum'])
         do_deactivate_user(user)
@@ -1158,39 +788,6 @@ class TestLoggingCountStats(AnalyticsTestCase):
         do_resend_user_invite_email(PreregistrationUser.objects.first())
         assertInviteCountEquals(6)
-    def test_messages_read_hour(self) -> None:
-        read_count_property = 'messages_read::hour'
-        interactions_property = 'messages_read_interactions::hour'
-        user1 = self.create_user()
-        user2 = self.create_user()
-        stream, recipient = self.create_stream_with_recipient()
-        self.subscribe(user1, stream.name)
-        self.subscribe(user2, stream.name)
-        self.send_personal_message(user1, user2)
-        client = get_client("website")
-        do_mark_all_as_read(user2, client)
-        self.assertEqual(1, UserCount.objects.filter(property=read_count_property)
-                         .aggregate(Sum('value'))['value__sum'])
-        self.assertEqual(1, UserCount.objects.filter(property=interactions_property)
-                         .aggregate(Sum('value'))['value__sum'])
-        self.send_stream_message(user1, stream.name)
-        self.send_stream_message(user1, stream.name)
-        do_mark_stream_messages_as_read(user2, client, stream)
-        self.assertEqual(3, UserCount.objects.filter(property=read_count_property)
-                         .aggregate(Sum('value'))['value__sum'])
-        self.assertEqual(2, UserCount.objects.filter(property=interactions_property)
-                         .aggregate(Sum('value'))['value__sum'])
-        message = self.send_stream_message(user2, stream.name)
-        do_update_message_flags(user1, client, 'add', 'read', [message])
-        self.assertEqual(4, UserCount.objects.filter(property=read_count_property)
-                         .aggregate(Sum('value'))['value__sum'])
-        self.assertEqual(3, UserCount.objects.filter(property=interactions_property)
-                         .aggregate(Sum('value'))['value__sum'])
 class TestDeleteStats(AnalyticsTestCase):
     def test_do_drop_all_analytics_tables(self) -> None:
         user = self.create_user()
@@ -1202,6 +799,7 @@ class TestDeleteStats(AnalyticsTestCase):
         RealmCount.objects.create(realm=user.realm, **count_args)
         InstallationCount.objects.create(**count_args)
         FillState.objects.create(property='test', end_time=self.TIME_ZERO, state=FillState.DONE)
+        Anomaly.objects.create(info='test anomaly')

         analytics = apps.get_app_config('analytics')
         for table in list(analytics.models.values()):
@@ -1224,6 +822,7 @@ class TestDeleteStats(AnalyticsTestCase):
         InstallationCount.objects.create(**count_args)
         FillState.objects.create(property='to_delete', end_time=self.TIME_ZERO, state=FillState.DONE)
         FillState.objects.create(property='to_save', end_time=self.TIME_ZERO, state=FillState.DONE)
+        Anomaly.objects.create(info='test anomaly')

         analytics = apps.get_app_config('analytics')
         for table in list(analytics.models.values()):
@@ -1231,8 +830,11 @@ class TestDeleteStats(AnalyticsTestCase):
         do_drop_single_stat('to_delete')
         for table in list(analytics.models.values()):
-            self.assertFalse(table.objects.filter(property='to_delete').exists())
-            self.assertTrue(table.objects.filter(property='to_save').exists())
+            if table._meta.db_table == 'analytics_anomaly':
+                self.assertTrue(table.objects.exists())
+            else:
+                self.assertFalse(table.objects.filter(property='to_delete').exists())
+                self.assertTrue(table.objects.filter(property='to_save').exists())
 class TestActiveUsersAudit(AnalyticsTestCase):
     def setUp(self) -> None:
@@ -1241,7 +843,7 @@ class TestActiveUsersAudit(AnalyticsTestCase):
         self.stat = COUNT_STATS['active_users_audit:is_bot:day']
         self.current_property = self.stat.property

-    def add_event(self, event_type: int, days_offset: float,
+    def add_event(self, event_type: str, days_offset: float,
                   user: Optional[UserProfile]=None) -> None:
         hours_offset = int(24*days_offset)
         if user is None:
@@ -1251,49 +853,49 @@ class TestActiveUsersAudit(AnalyticsTestCase):
             event_time=self.TIME_ZERO - hours_offset*self.HOUR)
     def test_user_deactivated_in_future(self) -> None:
-        self.add_event(RealmAuditLog.USER_CREATED, 1)
-        self.add_event(RealmAuditLog.USER_DEACTIVATED, 0)
+        self.add_event('user_created', 1)
+        self.add_event('user_deactivated', 0)
         do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
         self.assertTableState(UserCount, ['subgroup'], [['false']])

     def test_user_reactivated_in_future(self) -> None:
-        self.add_event(RealmAuditLog.USER_DEACTIVATED, 1)
-        self.add_event(RealmAuditLog.USER_REACTIVATED, 0)
+        self.add_event('user_deactivated', 1)
+        self.add_event('user_reactivated', 0)
         do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
         self.assertTableState(UserCount, [], [])

     def test_user_active_then_deactivated_same_day(self) -> None:
-        self.add_event(RealmAuditLog.USER_CREATED, 1)
-        self.add_event(RealmAuditLog.USER_DEACTIVATED, .5)
+        self.add_event('user_created', 1)
+        self.add_event('user_deactivated', .5)
         do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
         self.assertTableState(UserCount, [], [])

     def test_user_unactive_then_activated_same_day(self) -> None:
-        self.add_event(RealmAuditLog.USER_DEACTIVATED, 1)
-        self.add_event(RealmAuditLog.USER_REACTIVATED, .5)
+        self.add_event('user_deactivated', 1)
+        self.add_event('user_reactivated', .5)
         do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
         self.assertTableState(UserCount, ['subgroup'], [['false']])

     # Arguably these next two tests are duplicates of the _in_future tests, but are
     # a guard against future refactorings where they may no longer be duplicates
     def test_user_active_then_deactivated_with_day_gap(self) -> None:
-        self.add_event(RealmAuditLog.USER_CREATED, 2)
-        self.add_event(RealmAuditLog.USER_DEACTIVATED, 1)
+        self.add_event('user_created', 2)
+        self.add_event('user_deactivated', 1)
         process_count_stat(self.stat, self.TIME_ZERO)
         self.assertTableState(UserCount, ['subgroup', 'end_time'],
                               [['false', self.TIME_ZERO - self.DAY]])

     def test_user_deactivated_then_reactivated_with_day_gap(self) -> None:
-        self.add_event(RealmAuditLog.USER_DEACTIVATED, 2)
-        self.add_event(RealmAuditLog.USER_REACTIVATED, 1)
+        self.add_event('user_deactivated', 2)
+        self.add_event('user_reactivated', 1)
         process_count_stat(self.stat, self.TIME_ZERO)
         self.assertTableState(UserCount, ['subgroup'], [['false']])

     def test_event_types(self) -> None:
-        self.add_event(RealmAuditLog.USER_CREATED, 4)
-        self.add_event(RealmAuditLog.USER_DEACTIVATED, 3)
-        self.add_event(RealmAuditLog.USER_ACTIVATED, 2)
-        self.add_event(RealmAuditLog.USER_REACTIVATED, 1)
+        self.add_event('user_created', 4)
+        self.add_event('user_deactivated', 3)
+        self.add_event('user_activated', 2)
+        self.add_event('user_reactivated', 1)
         for i in range(4):
             do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO - i*self.DAY)
         self.assertTableState(UserCount, ['subgroup', 'end_time'],
@@ -1308,7 +910,7 @@ class TestActiveUsersAudit(AnalyticsTestCase):
         user3 = self.create_user(realm=second_realm)
         user4 = self.create_user(realm=second_realm, is_bot=True)
         for user in [user1, user2, user3, user4]:
-            self.add_event(RealmAuditLog.USER_CREATED, 1, user=user)
+            self.add_event('user_created', 1, user=user)
         do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
         self.assertTableState(UserCount, ['subgroup', 'user'],
                               [['false', user1], ['false', user2], ['false', user3], ['true', user4]])
@@ -1326,7 +928,7 @@ class TestActiveUsersAudit(AnalyticsTestCase):
     # CountStat.HOUR from CountStat.DAY, this will fail, while many of the
     # tests above will not.
     def test_update_from_two_days_ago(self) -> None:
-        self.add_event(RealmAuditLog.USER_CREATED, 2)
+        self.add_event('user_created', 2)
         process_count_stat(self.stat, self.TIME_ZERO)
         self.assertTableState(UserCount, ['subgroup', 'end_time'],
                               [['false', self.TIME_ZERO], ['false', self.TIME_ZERO-self.DAY]])
@@ -1335,22 +937,22 @@ class TestActiveUsersAudit(AnalyticsTestCase):
     # doesn't go through do_create_user. Mainly just want to make sure that
     # that situation doesn't throw an error.
     def test_empty_realm_or_user_with_no_relevant_activity(self) -> None:
-        self.add_event(RealmAuditLog.USER_SOFT_ACTIVATED, 1)
+        self.add_event('unrelated', 1)
         self.create_user()  # also test a user with no RealmAuditLog entries
         Realm.objects.create(string_id='moo', name='moo')
         do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
         self.assertTableState(UserCount, [], [])

     def test_max_audit_entry_is_unrelated(self) -> None:
-        self.add_event(RealmAuditLog.USER_CREATED, 1)
-        self.add_event(RealmAuditLog.USER_SOFT_ACTIVATED, .5)
+        self.add_event('user_created', 1)
+        self.add_event('unrelated', .5)
         do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
         self.assertTableState(UserCount, ['subgroup'], [['false']])

     # Simultaneous related audit entries should not be allowed, and so not testing for that.
     def test_simultaneous_unrelated_audit_entry(self) -> None:
-        self.add_event(RealmAuditLog.USER_CREATED, 1)
-        self.add_event(RealmAuditLog.USER_SOFT_ACTIVATED, 1)
+        self.add_event('user_created', 1)
+        self.add_event('unrelated', 1)
         do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
         self.assertTableState(UserCount, ['subgroup'], [['false']])
@@ -1358,19 +960,19 @@ class TestActiveUsersAudit(AnalyticsTestCase):
         user1 = self.create_user()
         user2 = self.create_user()
         user3 = self.create_user()
-        self.add_event(RealmAuditLog.USER_CREATED, .5, user=user1)
-        self.add_event(RealmAuditLog.USER_CREATED, .5, user=user2)
-        self.add_event(RealmAuditLog.USER_CREATED, 1, user=user3)
-        self.add_event(RealmAuditLog.USER_DEACTIVATED, .5, user=user3)
+        self.add_event('user_created', .5, user=user1)
+        self.add_event('user_created', .5, user=user2)
+        self.add_event('user_created', 1, user=user3)
+        self.add_event('user_deactivated', .5, user=user3)
         do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
         self.assertTableState(UserCount, ['user', 'subgroup'],
                               [[user1, 'false'], [user2, 'false']])

     def test_end_to_end_with_actions_dot_py(self) -> None:
-        user1 = do_create_user('email1', 'password', self.default_realm, 'full_name')
-        user2 = do_create_user('email2', 'password', self.default_realm, 'full_name')
-        user3 = do_create_user('email3', 'password', self.default_realm, 'full_name')
-        user4 = do_create_user('email4', 'password', self.default_realm, 'full_name')
+        user1 = do_create_user('email1', 'password', self.default_realm, 'full_name', 'short_name')
+        user2 = do_create_user('email2', 'password', self.default_realm, 'full_name', 'short_name')
+        user3 = do_create_user('email3', 'password', self.default_realm, 'full_name', 'short_name')
+        user4 = do_create_user('email4', 'password', self.default_realm, 'full_name', 'short_name')
         do_deactivate_user(user2)
         do_activate_user(user3)
         do_reactivate_user(user4)
@@ -1380,7 +982,7 @@ class TestActiveUsersAudit(AnalyticsTestCase):
         self.assertTrue(UserCount.objects.filter(
             user=user, property=self.current_property, subgroup='false',
             end_time=end_time, value=1).exists())
-        self.assertFalse(UserCount.objects.filter(user=user2, end_time=end_time).exists())
+        self.assertFalse(UserCount.objects.filter(user=user2).exists())
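The add_event change running through this class swaps free-form strings ('user_created', 'unrelated') for named constants on RealmAuditLog, which is why the signature also moves from event_type: str to event_type: int. A stand-in sketch of the pattern (the class below is hypothetical, not the Django model, and its numeric values are illustrative only; the real constants live on zerver.models.RealmAuditLog):

    class RealmAuditLogStandIn:
        # Hypothetical constants standing in for the model's event types.
        USER_CREATED = 101
        USER_DEACTIVATED = 103
        USER_REACTIVATED = 104

    def add_event(event_type: int) -> None:
        print(f'logging event {event_type}')

    add_event(RealmAuditLogStandIn.USER_CREATED)   # newer style: typo-proof constant
    # older equivalent: add_event('user_created') with event_type typed as str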
 class TestRealmActiveHumans(AnalyticsTestCase):
     def setUp(self) -> None:
@@ -1393,7 +995,7 @@ class TestRealmActiveHumans(AnalyticsTestCase):
             end_time = self.TIME_ZERO
         UserCount.objects.create(
             user=user, realm=user.realm, property='active_users_audit:is_bot:day',
-            subgroup=orjson.dumps(user.is_bot).decode(), end_time=end_time, value=1)
+            subgroup=ujson.dumps(user.is_bot), end_time=end_time, value=1)
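The subgroup= change above is part of the ujson-to-orjson migration: orjson.dumps returns bytes, so call sites that need a str add .decode(). A one-line check, assuming orjson is installed (ujson returned str directly):

    import json
    import orjson

    assert orjson.dumps(False) == b'false'                     # bytes, not str
    assert orjson.dumps(False).decode() == json.dumps(False)   # both give 'false'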
     def mark_15day_active(self, user: UserProfile, end_time: Optional[datetime]=None) -> None:
         if end_time is None:
@@ -1452,7 +1054,6 @@ class TestRealmActiveHumans(AnalyticsTestCase):
         self.create_user(realm=third_realm)

         RealmCount.objects.all().delete()
-        InstallationCount.objects.all().delete()
         for i in [-1, 0, 1]:
             do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO + i*self.DAY)
         self.assertTableState(RealmCount, ['value', 'realm', 'end_time'],
@@ -1462,9 +1063,9 @@ class TestRealmActiveHumans(AnalyticsTestCase):
                                [2, second_realm, self.TIME_ZERO - self.DAY]])

     def test_end_to_end(self) -> None:
-        user1 = do_create_user('email1', 'password', self.default_realm, 'full_name')
-        user2 = do_create_user('email2', 'password', self.default_realm, 'full_name')
-        do_create_user('email3', 'password', self.default_realm, 'full_name')
+        user1 = do_create_user('email1', 'password', self.default_realm, 'full_name', 'short_name')
+        user2 = do_create_user('email2', 'password', self.default_realm, 'full_name', 'short_name')
+        do_create_user('email3', 'password', self.default_realm, 'full_name', 'short_name')
         time_zero = floor_to_day(timezone_now()) + self.DAY
         update_user_activity_interval(user1, time_zero)
         update_user_activity_interval(user2, time_zero)

View File

@@ -2,7 +2,6 @@ from analytics.lib.counts import CountStat
 from analytics.lib.fixtures import generate_time_series_data
 from zerver.lib.test_classes import ZulipTestCase

 # A very light test suite; the code being tested is not run in production.
 class TestFixtures(ZulipTestCase):
     def test_deterministic_settings(self) -> None:

View File

@@ -1,89 +1,34 @@
-from datetime import datetime, timedelta, timezone
-from typing import List, Optional
-from unittest import mock
-
-import orjson
-from django.http import HttpResponse
-from django.utils.timezone import now as timezone_now
+from datetime import datetime, timedelta
+from typing import Dict, List, Optional
+
+import mock
+from django.utils.timezone import utc

 from analytics.lib.counts import COUNT_STATS, CountStat
 from analytics.lib.time_utils import time_range
-from analytics.models import FillState, RealmCount, UserCount, last_successful_fill
-from analytics.views import rewrite_client_arrays, sort_by_totals, sort_client_labels
-from corporate.lib.stripe import add_months, update_sponsorship_status
-from corporate.models import Customer, CustomerPlan, LicenseLedger, get_customer_by_realm
-from zerver.lib.actions import do_create_multiuse_invite_link, do_send_realm_reactivation_email
+from analytics.models import FillState, \
+    RealmCount, UserCount, last_successful_fill
+from analytics.views import get_chart_data, rewrite_client_arrays, \
+    sort_by_totals, sort_client_labels, stats
 from zerver.lib.test_classes import ZulipTestCase
-from zerver.lib.test_helpers import reset_emails_in_zulip_realm
-from zerver.lib.timestamp import ceiling_to_day, ceiling_to_hour, datetime_to_timestamp
-from zerver.models import (
-    Client,
-    MultiuseInvite,
-    PreregistrationUser,
-    Realm,
-    UserMessage,
-    UserProfile,
-    get_realm,
-)
+from zerver.lib.timestamp import ceiling_to_day, \
+    ceiling_to_hour, datetime_to_timestamp
+from zerver.models import Client, get_realm
 class TestStatsEndpoint(ZulipTestCase):
     def test_stats(self) -> None:
         self.user = self.example_user('hamlet')
-        self.login_user(self.user)
+        self.login(self.user.email)
         result = self.client_get('/stats')
         self.assertEqual(result.status_code, 200)
         # Check that we get something back
         self.assert_in_response("Zulip analytics for", result)
-    def test_guest_user_cant_access_stats(self) -> None:
-        self.user = self.example_user('polonius')
-        self.login_user(self.user)
-        result = self.client_get('/stats')
-        self.assert_json_error(result, "Not allowed for guest users", 400)
-        result = self.client_get('/json/analytics/chart_data')
-        self.assert_json_error(result, "Not allowed for guest users", 400)
-
-    def test_stats_for_realm(self) -> None:
-        user = self.example_user('hamlet')
-        self.login_user(user)
-        result = self.client_get('/stats/realm/zulip/')
-        self.assertEqual(result.status_code, 302)
-        user = self.example_user('hamlet')
-        user.is_staff = True
-        user.save(update_fields=['is_staff'])
-        result = self.client_get('/stats/realm/not_existing_realm/')
-        self.assertEqual(result.status_code, 302)
-        result = self.client_get('/stats/realm/zulip/')
-        self.assertEqual(result.status_code, 200)
-        self.assert_in_response("Zulip analytics for", result)
-
-    def test_stats_for_installation(self) -> None:
-        user = self.example_user('hamlet')
-        self.login_user(user)
-        result = self.client_get('/stats/installation')
-        self.assertEqual(result.status_code, 302)
-        user = self.example_user('hamlet')
-        user.is_staff = True
-        user.save(update_fields=['is_staff'])
-        result = self.client_get('/stats/installation')
-        self.assertEqual(result.status_code, 200)
-        self.assert_in_response("Zulip analytics for", result)
 class TestGetChartData(ZulipTestCase):
     def setUp(self) -> None:
-        super().setUp()
         self.realm = get_realm('zulip')
         self.user = self.example_user('hamlet')
-        self.login_user(self.user)
+        self.login(self.user.email)
         self.end_times_hour = [ceiling_to_hour(self.realm.date_created) + timedelta(hours=i)
                                for i in range(4)]
         self.end_times_day = [ceiling_to_day(self.realm.date_created) + timedelta(days=i)
@@ -114,10 +59,6 @@ class TestGetChartData(ZulipTestCase):
     def test_number_of_humans(self) -> None:
         stat = COUNT_STATS['realm_active_humans::day']
         self.insert_data(stat, [None], [])
-        stat = COUNT_STATS['1day_actives::day']
-        self.insert_data(stat, [None], [])
-        stat = COUNT_STATS['active_users_audit:is_bot:day']
-        self.insert_data(stat, ['false'], [])
         result = self.client_get('/json/analytics/chart_data',
                                  {'chart_name': 'number_of_humans'})
         self.assert_json_success(result)
@@ -126,7 +67,7 @@ class TestGetChartData(ZulipTestCase):
             'msg': '',
             'end_times': [datetime_to_timestamp(dt) for dt in self.end_times_day],
             'frequency': CountStat.DAY,
-            'everyone': {'_1day': self.data(100), '_15day': self.data(100), 'all_time': self.data(100)},
+            'realm': {'human': self.data(100)},
             'display_order': None,
             'result': 'success',
         })
@@ -142,7 +83,7 @@ class TestGetChartData(ZulipTestCase):
             'msg': '',
             'end_times': [datetime_to_timestamp(dt) for dt in self.end_times_hour],
             'frequency': CountStat.HOUR,
-            'everyone': {'bot': self.data(100), 'human': self.data(101)},
+            'realm': {'bot': self.data(100), 'human': self.data(101)},
             'user': {'bot': self.data(0), 'human': self.data(200)},
             'display_order': None,
             'result': 'success',
@@ -160,8 +101,8 @@ class TestGetChartData(ZulipTestCase):
             'msg': '',
             'end_times': [datetime_to_timestamp(dt) for dt in self.end_times_day],
             'frequency': CountStat.DAY,
-            'everyone': {'Public streams': self.data(100), 'Private streams': self.data(0),
-                         'Private messages': self.data(101), 'Group private messages': self.data(0)},
+            'realm': {'Public streams': self.data(100), 'Private streams': self.data(0),
+                      'Private messages': self.data(101), 'Group private messages': self.data(0)},
             'user': {'Public streams': self.data(200), 'Private streams': self.data(201),
                      'Private messages': self.data(0), 'Group private messages': self.data(0)},
             'display_order': ['Private messages', 'Public streams', 'Private streams', 'Group private messages'],
@@ -184,30 +125,13 @@ class TestGetChartData(ZulipTestCase):
             'msg': '',
             'end_times': [datetime_to_timestamp(dt) for dt in self.end_times_day],
             'frequency': CountStat.DAY,
-            'everyone': {'client 4': self.data(100), 'client 3': self.data(101),
-                         'client 2': self.data(102)},
+            'realm': {'client 4': self.data(100), 'client 3': self.data(101),
+                      'client 2': self.data(102)},
             'user': {'client 3': self.data(200), 'client 1': self.data(201)},
             'display_order': ['client 1', 'client 2', 'client 3', 'client 4'],
             'result': 'success',
         })
-    def test_messages_read_over_time(self) -> None:
-        stat = COUNT_STATS['messages_read::hour']
-        self.insert_data(stat, [None], [])
-        result = self.client_get('/json/analytics/chart_data',
-                                 {'chart_name': 'messages_read_over_time'})
-        self.assert_json_success(result)
-        data = result.json()
-        self.assertEqual(data, {
-            'msg': '',
-            'end_times': [datetime_to_timestamp(dt) for dt in self.end_times_hour],
-            'frequency': CountStat.HOUR,
-            'everyone': {'read': self.data(100)},
-            'user': {'read': self.data(0)},
-            'display_order': None,
-            'result': 'success',
-        })
     def test_include_empty_subgroups(self) -> None:
         FillState.objects.create(
             property='realm_active_humans::day', end_time=self.end_times_day[0],
@@ -216,7 +140,7 @@ class TestGetChartData(ZulipTestCase):
                                  {'chart_name': 'number_of_humans'})
         self.assert_json_success(result)
         data = result.json()
-        self.assertEqual(data['everyone'], {"_1day": [0], "_15day": [0], "all_time": [0]})
+        self.assertEqual(data['realm'], {'human': [0]})
         self.assertFalse('user' in data)

         FillState.objects.create(
@@ -226,7 +150,7 @@ class TestGetChartData(ZulipTestCase):
                                  {'chart_name': 'messages_sent_over_time'})
         self.assert_json_success(result)
         data = result.json()
-        self.assertEqual(data['everyone'], {'human': [0], 'bot': [0]})
+        self.assertEqual(data['realm'], {'human': [0], 'bot': [0]})
         self.assertEqual(data['user'], {'human': [0], 'bot': [0]})

         FillState.objects.create(
@@ -236,7 +160,7 @@ class TestGetChartData(ZulipTestCase):
                                  {'chart_name': 'messages_sent_by_message_type'})
         self.assert_json_success(result)
         data = result.json()
-        self.assertEqual(data['everyone'], {
+        self.assertEqual(data['realm'], {
             'Public streams': [0], 'Private streams': [0],
             'Private messages': [0], 'Group private messages': [0]})
         self.assertEqual(data['user'], {
@@ -250,16 +174,12 @@ class TestGetChartData(ZulipTestCase):
                                  {'chart_name': 'messages_sent_by_client'})
         self.assert_json_success(result)
         data = result.json()
-        self.assertEqual(data['everyone'], {})
+        self.assertEqual(data['realm'], {})
         self.assertEqual(data['user'], {})
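These assertions track a key rename in the chart_data payload: series covering everyone on the server moved from 'realm' to 'everyone', and the number_of_humans chart switched from a single 'human' series to '_1day'/'_15day'/'all_time' series. Hypothetical minimal payloads, just to show the two shapes side by side:

    newer = {'frequency': 'day', 'everyone': {'_1day': [0], '_15day': [0], 'all_time': [0]}}
    older = {'frequency': 'day', 'realm': {'human': [0]}}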
def test_start_and_end(self) -> None: def test_start_and_end(self) -> None:
stat = COUNT_STATS['realm_active_humans::day'] stat = COUNT_STATS['realm_active_humans::day']
self.insert_data(stat, [None], []) self.insert_data(stat, [None], [])
stat = COUNT_STATS['1day_actives::day']
self.insert_data(stat, [None], [])
stat = COUNT_STATS['active_users_audit:is_bot:day']
self.insert_data(stat, ['false'], [])
end_time_timestamps = [datetime_to_timestamp(dt) for dt in self.end_times_day]
# valid start and end
@@ -270,7 +190,7 @@ class TestGetChartData(ZulipTestCase):
self.assert_json_success(result)
data = result.json()
self.assertEqual(data['end_times'], end_time_timestamps[1:3])
-self.assertEqual(data['everyone'], {'_1day': [0, 100], '_15day': [0, 100], 'all_time': [0, 100]})
+self.assertEqual(data['realm'], {'human': [0, 100]})
# start later than end
result = self.client_get('/json/analytics/chart_data',
@@ -282,10 +202,6 @@ class TestGetChartData(ZulipTestCase):
def test_min_length(self) -> None:
stat = COUNT_STATS['realm_active_humans::day']
self.insert_data(stat, [None], [])
stat = COUNT_STATS['1day_actives::day']
self.insert_data(stat, [None], [])
stat = COUNT_STATS['active_users_audit:is_bot:day']
self.insert_data(stat, ['false'], [])
# test min_length is too short to change anything
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'number_of_humans',
@@ -293,7 +209,7 @@ class TestGetChartData(ZulipTestCase):
self.assert_json_success(result)
data = result.json()
self.assertEqual(data['end_times'], [datetime_to_timestamp(dt) for dt in self.end_times_day])
-self.assertEqual(data['everyone'], {'_1day': self.data(100), '_15day': self.data(100), 'all_time': self.data(100)})
+self.assertEqual(data['realm'], {'human': self.data(100)})
# test min_length larger than filled data
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'number_of_humans',
@@ -302,7 +218,7 @@ class TestGetChartData(ZulipTestCase):
data = result.json()
end_times = [ceiling_to_day(self.realm.date_created) + timedelta(days=i) for i in range(-1, 4)]
self.assertEqual(data['end_times'], [datetime_to_timestamp(dt) for dt in end_times])
-self.assertEqual(data['everyone'], {'_1day': [0]+self.data(100), '_15day': [0]+self.data(100), 'all_time': [0]+self.data(100)})
+self.assertEqual(data['realm'], {'human': [0]+self.data(100)})
def test_non_existent_chart(self) -> None:
result = self.client_get('/json/analytics/chart_data',
@@ -310,405 +226,20 @@ class TestGetChartData(ZulipTestCase):
self.assert_json_error_contains(result, 'Unknown chart name')
def test_analytics_not_running(self) -> None:
-realm = get_realm("zulip")
-realm.date_created = timezone_now() - timedelta(days=3)
-realm.save(update_fields=["date_created"])
+# try to get data for a valid chart, but before we've put anything in the database
+# (e.g. before update_analytics_counts has been run)
+self.assertEqual(FillState.objects.count(), 0)
with mock.patch('logging.warning'):
result = self.client_get('/json/analytics/chart_data',
-{'chart_name': 'messages_sent_over_time'})
+{'chart_name': 'number_of_humans'})
self.assert_json_error_contains(result, 'No analytics data available')
realm.date_created = timezone_now() - timedelta(days=1, hours=2)
realm.save(update_fields=["date_created"])
with mock.patch('logging.warning'):
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_error_contains(result, 'No analytics data available')
realm.date_created = timezone_now() - timedelta(days=1, minutes=10)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
realm.date_created = timezone_now() - timedelta(hours=10)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
end_time = timezone_now() - timedelta(days=5)
fill_state = FillState.objects.create(property='messages_sent:is_bot:hour', end_time=end_time,
state=FillState.DONE)
realm.date_created = timezone_now() - timedelta(days=3)
realm.save(update_fields=["date_created"])
with mock.patch('logging.warning'):
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_error_contains(result, 'No analytics data available')
realm.date_created = timezone_now() - timedelta(days=1, minutes=10)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
end_time = timezone_now() - timedelta(days=2)
fill_state.end_time = end_time
fill_state.save(update_fields=["end_time"])
realm.date_created = timezone_now() - timedelta(days=3)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
realm.date_created = timezone_now() - timedelta(days=1, hours=2)
realm.save(update_fields=["date_created"])
with mock.patch('logging.warning'):
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_error_contains(result, 'No analytics data available')
realm.date_created = timezone_now() - timedelta(days=1, minutes=10)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data', {'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
def test_get_chart_data_for_realm(self) -> None:
user = self.example_user('hamlet')
self.login_user(user)
result = self.client_get('/json/analytics/chart_data/realm/zulip',
{'chart_name': 'number_of_humans'})
self.assert_json_error(result, "Must be an server administrator", 400)
user = self.example_user('hamlet')
user.is_staff = True
user.save(update_fields=['is_staff'])
stat = COUNT_STATS['realm_active_humans::day']
self.insert_data(stat, [None], [])
result = self.client_get('/json/analytics/chart_data/realm/not_existing_realm',
{'chart_name': 'number_of_humans'})
self.assert_json_error(result, 'Invalid organization', 400)
result = self.client_get('/json/analytics/chart_data/realm/zulip',
{'chart_name': 'number_of_humans'})
self.assert_json_success(result)
def test_get_chart_data_for_installation(self) -> None:
user = self.example_user('hamlet')
self.login_user(user)
result = self.client_get('/json/analytics/chart_data/installation',
{'chart_name': 'number_of_humans'})
self.assert_json_error(result, "Must be an server administrator", 400)
user = self.example_user('hamlet')
user.is_staff = True
user.save(update_fields=['is_staff'])
stat = COUNT_STATS['realm_active_humans::day']
self.insert_data(stat, [None], [])
result = self.client_get('/json/analytics/chart_data/installation',
{'chart_name': 'number_of_humans'})
self.assert_json_success(result)
class TestSupportEndpoint(ZulipTestCase):
def test_search(self) -> None:
reset_emails_in_zulip_realm()
def check_hamlet_user_query_result(result: HttpResponse) -> None:
self.assert_in_success_response(['<span class="label">user</span>\n', '<h3>King Hamlet</h3>',
'<b>Email</b>: hamlet@zulip.com', '<b>Is active</b>: True<br>',
'<b>Admins</b>: desdemona@zulip.com, iago@zulip.com\n',
'class="copy-button" data-copytext="desdemona@zulip.com, iago@zulip.com"',
], result)
def check_zulip_realm_query_result(result: HttpResponse) -> None:
zulip_realm = get_realm("zulip")
self.assert_in_success_response([f'<input type="hidden" name="realm_id" value="{zulip_realm.id}"',
'Zulip Dev</h3>',
'<option value="1" selected>Self Hosted</option>',
'<option value="2" >Limited</option>',
'input type="number" name="discount" value="None"',
'<option value="active" selected>Active</option>',
'<option value="deactivated" >Deactivated</option>',
'scrub-realm-button">',
'data-string-id="zulip"'], result)
def check_lear_realm_query_result(result: HttpResponse) -> None:
lear_realm = get_realm("lear")
self.assert_in_success_response([f'<input type="hidden" name="realm_id" value="{lear_realm.id}"',
'Lear &amp; Co.</h3>',
'<option value="1" selected>Self Hosted</option>',
'<option value="2" >Limited</option>',
'input type="number" name="discount" value="None"',
'<option value="active" selected>Active</option>',
'<option value="deactivated" >Deactivated</option>',
'scrub-realm-button">',
'data-string-id="lear"',
'<b>Name</b>: Zulip Standard',
'<b>Status</b>: Active',
'<b>Billing schedule</b>: Annual',
'<b>Licenses</b>: 2/10 (Manual)',
'<b>Price per license</b>: $80.0',
'<b>Payment method</b>: Send invoice',
'<b>Next invoice date</b>: 02 January 2017',
], result)
def check_preregistration_user_query_result(result: HttpResponse, email: str, invite: bool=False) -> None:
self.assert_in_success_response(['<span class="label">preregistration user</span>\n',
f'<b>Email</b>: {email}',
], result)
if invite:
self.assert_in_success_response(['<span class="label">invite</span>'], result)
self.assert_in_success_response(['<b>Expires in</b>: 1\xa0week, 3',
'<b>Status</b>: Link has never been clicked'], result)
self.assert_in_success_response([], result)
else:
self.assert_not_in_success_response(['<span class="label">invite</span>'], result)
self.assert_in_success_response(['<b>Expires in</b>: 1\xa0day',
'<b>Status</b>: Link has never been clicked'], result)
def check_realm_creation_query_result(result: HttpResponse, email: str) -> None:
self.assert_in_success_response(['<span class="label">preregistration user</span>\n',
'<span class="label">realm creation</span>\n',
'<b>Link</b>: http://testserver/accounts/do_confirm/',
'<b>Expires in</b>: 1\xa0day<br>\n',
], result)
def check_multiuse_invite_link_query_result(result: HttpResponse) -> None:
self.assert_in_success_response(['<span class="label">multiuse invite</span>\n',
'<b>Link</b>: http://zulip.testserver/join/',
'<b>Expires in</b>: 1\xa0week, 3',
], result)
def check_realm_reactivation_link_query_result(result: HttpResponse) -> None:
self.assert_in_success_response(['<span class="label">realm reactivation</span>\n',
'<b>Link</b>: http://zulip.testserver/reactivate/',
'<b>Expires in</b>: 1\xa0day',
], result)
self.login('cordelia')
result = self.client_get("/activity/support")
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
self.login('iago')
customer = Customer.objects.create(realm=get_realm("lear"), stripe_customer_id='cus_123')
now = datetime(2016, 1, 2, tzinfo=timezone.utc)
plan = CustomerPlan.objects.create(customer=customer, billing_cycle_anchor=now,
billing_schedule=CustomerPlan.ANNUAL, tier=CustomerPlan.STANDARD,
price_per_license=8000, next_invoice_date=add_months(now, 12))
LicenseLedger.objects.create(licenses=10, licenses_at_next_renewal=10, event_time=timezone_now(),
is_renewal=True, plan=plan)
result = self.client_get("/activity/support")
self.assert_in_success_response(['<input type="text" name="q" class="input-xxlarge search-query"'], result)
result = self.client_get("/activity/support", {"q": "hamlet@zulip.com"})
check_hamlet_user_query_result(result)
check_zulip_realm_query_result(result)
result = self.client_get("/activity/support", {"q": "lear"})
check_lear_realm_query_result(result)
result = self.client_get("/activity/support", {"q": "http://lear.testserver"})
check_lear_realm_query_result(result)
with self.settings(REALM_HOSTS={'zulip': 'localhost'}):
result = self.client_get("/activity/support", {"q": "http://localhost"})
check_zulip_realm_query_result(result)
result = self.client_get("/activity/support", {"q": "hamlet@zulip.com, lear"})
check_hamlet_user_query_result(result)
check_zulip_realm_query_result(result)
check_lear_realm_query_result(result)
result = self.client_get("/activity/support", {"q": "lear, Hamlet <hamlet@zulip.com>"})
check_hamlet_user_query_result(result)
check_zulip_realm_query_result(result)
check_lear_realm_query_result(result)
self.client_post('/accounts/home/', {'email': self.nonreg_email("test")})
self.login('iago')
result = self.client_get("/activity/support", {"q": self.nonreg_email("test")})
check_preregistration_user_query_result(result, self.nonreg_email("test"))
check_zulip_realm_query_result(result)
stream_ids = [self.get_stream_id("Denmark")]
invitee_emails = [self.nonreg_email("test1")]
self.client_post("/json/invites", {"invitee_emails": invitee_emails,
"stream_ids": orjson.dumps(stream_ids).decode(),
"invite_as": PreregistrationUser.INVITE_AS['MEMBER']})
result = self.client_get("/activity/support", {"q": self.nonreg_email("test1")})
check_preregistration_user_query_result(result, self.nonreg_email("test1"), invite=True)
check_zulip_realm_query_result(result)
email = self.nonreg_email('alice')
self.client_post('/new/', {'email': email})
result = self.client_get("/activity/support", {"q": email})
check_realm_creation_query_result(result, email)
do_create_multiuse_invite_link(self.example_user("hamlet"), invited_as=1)
result = self.client_get("/activity/support", {"q": "zulip"})
check_multiuse_invite_link_query_result(result)
check_zulip_realm_query_result(result)
MultiuseInvite.objects.all().delete()
do_send_realm_reactivation_email(get_realm("zulip"))
result = self.client_get("/activity/support", {"q": "zulip"})
check_realm_reactivation_link_query_result(result)
check_zulip_realm_query_result(result)
def test_change_plan_type(self) -> None:
cordelia = self.example_user('cordelia')
self.login_user(cordelia)
result = self.client_post("/activity/support", {"realm_id": f"{cordelia.realm_id}", "plan_type": "2"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
iago = self.example_user("iago")
self.login_user(iago)
with mock.patch("analytics.views.do_change_plan_type") as m:
result = self.client_post("/activity/support", {"realm_id": f"{iago.realm_id}", "plan_type": "2"})
m.assert_called_once_with(get_realm("zulip"), 2)
self.assert_in_success_response(["Plan type of Zulip Dev changed from self hosted to limited"], result)
def test_attach_discount(self) -> None:
cordelia = self.example_user('cordelia')
lear_realm = get_realm('lear')
self.login_user(cordelia)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "discount": "25"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
self.login('iago')
with mock.patch("analytics.views.attach_discount_to_realm") as m:
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "discount": "25"})
m.assert_called_once_with(get_realm("lear"), 25)
self.assert_in_success_response(["Discount of Lear &amp; Co. changed to 25 from None"], result)
def test_change_sponsorship_status(self) -> None:
lear_realm = get_realm("lear")
self.assertIsNone(get_customer_by_realm(lear_realm))
cordelia = self.example_user('cordelia')
self.login_user(cordelia)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}",
"sponsorship_pending": "true"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
iago = self.example_user("iago")
self.login_user(iago)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}",
"sponsorship_pending": "true"})
self.assert_in_success_response(["Lear &amp; Co. marked as pending sponsorship."], result)
customer = get_customer_by_realm(lear_realm)
assert(customer is not None)
self.assertTrue(customer.sponsorship_pending)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}",
"sponsorship_pending": "false"})
self.assert_in_success_response(["Lear &amp; Co. is no longer pending sponsorship."], result)
customer = get_customer_by_realm(lear_realm)
assert(customer is not None)
self.assertFalse(customer.sponsorship_pending)
def test_approve_sponsorship(self) -> None:
lear_realm = get_realm("lear")
update_sponsorship_status(lear_realm, True)
king_user = self.lear_user("king")
king_user.role = UserProfile.ROLE_REALM_OWNER
king_user.save()
cordelia = self.example_user('cordelia')
self.login_user(cordelia)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}",
"approve_sponsorship": "approve_sponsorship"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
iago = self.example_user("iago")
self.login_user(iago)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}",
"approve_sponsorship": "approve_sponsorship"})
self.assert_in_success_response(["Sponsorship approved for Lear &amp; Co."], result)
lear_realm.refresh_from_db()
self.assertEqual(lear_realm.plan_type, Realm.STANDARD_FREE)
customer = get_customer_by_realm(lear_realm)
assert(customer is not None)
self.assertFalse(customer.sponsorship_pending)
messages = UserMessage.objects.filter(user_profile=king_user)
self.assertIn("request for sponsored hosting has been approved", messages[0].message.content)
self.assertEqual(len(messages), 1)
def test_activate_or_deactivate_realm(self) -> None:
cordelia = self.example_user('cordelia')
lear_realm = get_realm('lear')
self.login_user(cordelia)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "status": "deactivated"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
self.login('iago')
with mock.patch("analytics.views.do_deactivate_realm") as m:
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "status": "deactivated"})
m.assert_called_once_with(lear_realm, self.example_user("iago"))
self.assert_in_success_response(["Lear &amp; Co. deactivated"], result)
with mock.patch("analytics.views.do_send_realm_reactivation_email") as m:
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "status": "active"})
m.assert_called_once_with(lear_realm)
self.assert_in_success_response(["Realm reactivation email sent to admins of Lear"], result)
def test_scrub_realm(self) -> None:
cordelia = self.example_user('cordelia')
lear_realm = get_realm('lear')
self.login_user(cordelia)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "discount": "25"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
self.login('iago')
with mock.patch("analytics.views.do_scrub_realm") as m:
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "scrub_realm": "scrub_realm"})
m.assert_called_once_with(lear_realm, acting_user=self.example_user("iago"))
self.assert_in_success_response(["Lear &amp; Co. scrubbed"], result)
with mock.patch("analytics.views.do_scrub_realm") as m:
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}"})
self.assert_json_error(result, "Invalid parameters")
m.assert_not_called()
class TestGetChartDataHelpers(ZulipTestCase):
# last_successful_fill is in analytics/models.py, but get_chart_data is
# the only function that uses it at the moment
def test_last_successful_fill(self) -> None:
self.assertIsNone(last_successful_fill('non-existant'))
-a_time = datetime(2016, 3, 14, 19, tzinfo=timezone.utc)
-one_hour_before = datetime(2016, 3, 14, 18, tzinfo=timezone.utc)
+a_time = datetime(2016, 3, 14, 19).replace(tzinfo=utc)
+one_hour_before = datetime(2016, 3, 14, 18).replace(tzinfo=utc)
fillstate = FillState.objects.create(property='property', end_time=a_time,
state=FillState.DONE)
self.assertEqual(last_successful_fill('property'), a_time)
@@ -717,12 +248,12 @@ class TestGetChartDataHelpers(ZulipTestCase):
self.assertEqual(last_successful_fill('property'), one_hour_before)
def test_sort_by_totals(self) -> None:
-empty: List[int] = []
+empty = []  # type: List[int]
value_arrays = {'c': [0, 1], 'a': [9], 'b': [1, 1, 1], 'd': empty}
self.assertEqual(sort_by_totals(value_arrays), ['a', 'b', 'c', 'd'])
def test_sort_client_labels(self) -> None:
-data = {'everyone': {'a': [16], 'c': [15], 'b': [14], 'e': [13], 'd': [12], 'h': [11]},
+data = {'realm': {'a': [16], 'c': [15], 'b': [14], 'e': [13], 'd': [12], 'h': [11]},
'user': {'a': [6], 'b': [5], 'd': [4], 'e': [3], 'f': [2], 'g': [1]}}
self.assertEqual(sort_client_labels(data), ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h'])
@@ -731,9 +262,9 @@ class TestTimeRange(ZulipTestCase):
HOUR = timedelta(hours=1)
DAY = timedelta(days=1)
-a_time = datetime(2016, 3, 14, 22, 59, tzinfo=timezone.utc)
-floor_hour = datetime(2016, 3, 14, 22, tzinfo=timezone.utc)
-floor_day = datetime(2016, 3, 14, tzinfo=timezone.utc)
+a_time = datetime(2016, 3, 14, 22, 59).replace(tzinfo=utc)
+floor_hour = datetime(2016, 3, 14, 22).replace(tzinfo=utc)
+floor_day = datetime(2016, 3, 14).replace(tzinfo=utc)
# test start == end
self.assertEqual(time_range(a_time, a_time, CountStat.HOUR, None), [])

View File

@@ -1,34 +1,20 @@
-from django.conf.urls import include
-from django.urls import path
+from django.conf.urls import include, url
import analytics.views
from zerver.lib.rest import rest_dispatch
i18n_urlpatterns = [
# Server admin (user_profile.is_staff) visible stats pages
-path('activity', analytics.views.get_activity,
-name='analytics.views.get_activity'),
-path('activity/support', analytics.views.support,
-name='analytics.views.support'),
-path('realm_activity/<str:realm_str>/', analytics.views.get_realm_activity,
-name='analytics.views.get_realm_activity'),
-path('user_activity/<str:email>/', analytics.views.get_user_activity,
-name='analytics.views.get_user_activity'),
-path('stats/realm/<str:realm_str>/', analytics.views.stats_for_realm,
-name='analytics.views.stats_for_realm'),
-path('stats/installation', analytics.views.stats_for_installation,
-name='analytics.views.stats_for_installation'),
-path('stats/remote/<int:remote_server_id>/installation',
-analytics.views.stats_for_remote_installation,
-name='analytics.views.stats_for_remote_installation'),
-path('stats/remote/<int:remote_server_id>/realm/<int:remote_realm_id>/',
-analytics.views.stats_for_remote_realm,
-name='analytics.views.stats_for_remote_realm'),
+url(r'^activity$', analytics.views.get_activity,
+name='analytics.views.get_activity'),
+url(r'^realm_activity/(?P<realm_str>[\S]+)/$', analytics.views.get_realm_activity,
+name='analytics.views.get_realm_activity'),
+url(r'^user_activity/(?P<email>[\S]+)/$', analytics.views.get_user_activity,
+name='analytics.views.get_user_activity'),
# User-visible stats page
-path('stats', analytics.views.stats,
+url(r'^stats$', analytics.views.stats,
name='analytics.views.stats'),
]
# These endpoints are a part of the API (V1), which uses:
@@ -41,22 +27,13 @@ i18n_urlpatterns = [
# All of these paths are accessed by either a /json or /api prefix
v1_api_and_json_patterns = [
# get data for the graphs at /stats
-path('analytics/chart_data', rest_dispatch,
+url(r'^analytics/chart_data$', rest_dispatch,
{'GET': 'analytics.views.get_chart_data'}),
-path('analytics/chart_data/realm/<str:realm_str>', rest_dispatch,
-{'GET': 'analytics.views.get_chart_data_for_realm'}),
-path('analytics/chart_data/installation', rest_dispatch,
-{'GET': 'analytics.views.get_chart_data_for_installation'}),
-path('analytics/chart_data/remote/<int:remote_server_id>/installation', rest_dispatch,
-{'GET': 'analytics.views.get_chart_data_for_remote_installation'}),
-path('analytics/chart_data/remote/<int:remote_server_id>/realm/<int:remote_realm_id>',
-rest_dispatch,
-{'GET': 'analytics.views.get_chart_data_for_remote_realm'}),
]
i18n_urlpatterns += [
-path('api/v1/', include(v1_api_and_json_patterns)),
-path('json/', include(v1_api_and_json_patterns)),
+url(r'^api/v1/', include(v1_api_and_json_patterns)),
+url(r'^json/', include(v1_api_and_json_patterns)),
]
urlpatterns = i18n_urlpatterns

File diff suppressed because it is too large

View File

@@ -1,20 +0,0 @@
"use strict";
module.exports = {
presets: [
[
"@babel/preset-env",
{
corejs: 3,
loose: true, // Loose mode for…of loops are 5× faster in Firefox
useBuiltIns: "usage",
},
],
"@babel/typescript",
],
plugins: [
"@babel/proposal-class-properties",
["@babel/plugin-proposal-unicode-property-regex", {useUnicodeFlag: false}],
],
sourceType: "unambiguous",
};

View File

@@ -1,3 +1,5 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2008, Jarek Zgoda <jarek.zgoda@gmail.com>
# Permission is hereby granted, free of charge, to any person obtaining a

View File

@@ -1,5 +1,6 @@
# -*- coding: utf-8 -*-
from django.db import models, migrations
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):

View File

@@ -1,5 +1,6 @@
# -*- coding: utf-8 -*-
from django.db import models, migrations
import django.utils.timezone
from django.db import migrations, models
class Migration(migrations.Migration):

View File

@@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.10.4 on 2017-01-17 09:16
from django.db import migrations

View File

@@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.2 on 2017-07-08 04:23
from django.db import migrations, models

View File

@@ -1,6 +1,9 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2017-11-30 00:13
-import django.db.models.deletion
+from __future__ import unicode_literals
from django.db import migrations, models
+import django.db.models.deletion
class Migration(migrations.Migration):

View File

@@ -1,4 +1,6 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2018-01-29 18:39
from __future__ import unicode_literals
from django.db import migrations, models

View File

@@ -1,37 +0,0 @@
# Generated by Django 2.2.10 on 2020-03-27 09:02
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('confirmation', '0006_realmcreationkey_presume_email_valid'),
]
operations = [
migrations.AlterField(
model_name='confirmation',
name='confirmation_key',
field=models.CharField(db_index=True, max_length=40),
),
migrations.AlterField(
model_name='confirmation',
name='date_sent',
field=models.DateTimeField(db_index=True),
),
migrations.AlterField(
model_name='confirmation',
name='object_id',
field=models.PositiveIntegerField(db_index=True),
),
migrations.AlterField(
model_name='realmcreationkey',
name='creation_key',
field=models.CharField(db_index=True, max_length=40, verbose_name='activation key'),
),
migrations.AlterUniqueTogether(
name='confirmation',
unique_together={('type', 'confirmation_key')},
),
]

View File

@@ -1,24 +1,28 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2008, Jarek Zgoda <jarek.zgoda@gmail.com>
__revision__ = '$Id: models.py 28 2009-10-22 15:03:02Z jarek.zgoda $'
-import datetime
-import string
-from random import SystemRandom
-from typing import Mapping, Optional, Union
-from urllib.parse import urljoin
-from django.conf import settings
-from django.contrib.contenttypes.fields import GenericForeignKey
-from django.contrib.contenttypes.models import ContentType
+import datetime
from django.db import models
from django.db.models import CASCADE
-from django.urls import reverse
+from django.conf import settings
+from django.contrib.contenttypes.models import ContentType
+from django.contrib.contenttypes.fields import GenericForeignKey
from django.http import HttpRequest, HttpResponse
from django.shortcuts import render
+from django.urls import reverse
from django.utils.timezone import now as timezone_now
-from zerver.models import EmailChangeStatus, MultiuseInvite, PreregistrationUser, Realm, UserProfile
+from zerver.lib.send_email import send_email
+from zerver.lib.utils import generate_random_token
+from zerver.models import PreregistrationUser, EmailChangeStatus, MultiuseInvite, \
+UserProfile, Realm
+from random import SystemRandom
+import string
+from typing import Any, Dict, Optional, Text, Union
class ConfirmationKeyException(Exception):
WRONG_LENGTH = 1
@@ -43,8 +47,7 @@ def generate_key() -> str:
ConfirmationObjT = Union[MultiuseInvite, PreregistrationUser, EmailChangeStatus]
def get_object_from_key(confirmation_key: str,
-confirmation_type: int,
-activate_object: bool=True) -> ConfirmationObjT:
+confirmation_type: int) -> ConfirmationObjT:
# Confirmation keys used to be 40 characters
if len(confirmation_key) not in (24, 40):
raise ConfirmationKeyException(ConfirmationKeyException.WRONG_LENGTH)
@@ -59,42 +62,35 @@ def get_object_from_key(confirmation_key: str,
raise ConfirmationKeyException(ConfirmationKeyException.EXPIRED)
obj = confirmation.content_object
-if activate_object and hasattr(obj, "status"):
+if hasattr(obj, "status"):
obj.status = getattr(settings, 'STATUS_ACTIVE', 1)
obj.save(update_fields=['status'])
return obj
-def create_confirmation_link(obj: ContentType,
-confirmation_type: int,
-url_args: Mapping[str, str] = {}) -> str:
+def create_confirmation_link(obj: ContentType, host: str,
+confirmation_type: int,
+url_args: Optional[Dict[str, str]]=None) -> str:
key = generate_key()
-realm = None
-if hasattr(obj, 'realm'):
-realm = obj.realm
-elif isinstance(obj, Realm):
-realm = obj
Confirmation.objects.create(content_object=obj, date_sent=timezone_now(), confirmation_key=key,
-realm=realm, type=confirmation_type)
-return confirmation_url(key, realm, confirmation_type, url_args)
+realm=obj.realm, type=confirmation_type)
+return confirmation_url(key, host, confirmation_type, url_args)
-def confirmation_url(confirmation_key: str, realm: Optional[Realm],
-confirmation_type: int,
-url_args: Mapping[str, str] = {}) -> str:
-url_args = dict(url_args)
+def confirmation_url(confirmation_key: str, host: str,
+confirmation_type: int,
+url_args: Optional[Dict[str, str]]=None) -> str:
+if url_args is None:
+url_args = {}
url_args['confirmation_key'] = confirmation_key
-return urljoin(
-settings.ROOT_DOMAIN_URI if realm is None else realm.uri,
-reverse(_properties[confirmation_type].url_name, kwargs=url_args),
-)
+return '%s%s%s' % (settings.EXTERNAL_URI_SCHEME, host,
+reverse(_properties[confirmation_type].url_name, kwargs=url_args))
class Confirmation(models.Model):
content_type = models.ForeignKey(ContentType, on_delete=CASCADE)
-object_id: int = models.PositiveIntegerField(db_index=True)
+object_id = models.PositiveIntegerField()  # type: int
content_object = GenericForeignKey('content_type', 'object_id')
-date_sent: datetime.datetime = models.DateTimeField(db_index=True)
-confirmation_key: str = models.CharField(max_length=40, db_index=True)
-realm: Optional[Realm] = models.ForeignKey(Realm, null=True, on_delete=CASCADE)
+date_sent = models.DateTimeField()  # type: datetime.datetime
+confirmation_key = models.CharField(max_length=40)  # type: str
+realm = models.ForeignKey(Realm, null=True, on_delete=CASCADE)  # type: Optional[Realm]
# The following list is the set of valid types
USER_REGISTRATION = 1
@@ -104,14 +100,10 @@ class Confirmation(models.Model):
SERVER_REGISTRATION = 5
MULTIUSE_INVITE = 6
REALM_CREATION = 7
-REALM_REACTIVATION = 8
-type: int = models.PositiveSmallIntegerField()
+type = models.PositiveSmallIntegerField()  # type: int
-def __str__(self) -> str:
-return f'<Confirmation: {self.content_object}>'
-class Meta:
-unique_together = ("type", "confirmation_key")
+def __str__(self) -> Text:
+return '<Confirmation: %s>' % (self.content_object,)
class ConfirmationType:
def __init__(self, url_name: str,
@@ -130,18 +122,8 @@ _properties = {
'zerver.views.registration.accounts_home_from_multiuse_invite',
validity_in_days=settings.INVITATION_LINK_VALIDITY_DAYS),
Confirmation.REALM_CREATION: ConfirmationType('check_prereg_key_and_redirect'),
-Confirmation.REALM_REACTIVATION: ConfirmationType('zerver.views.realm.realm_reactivation'),
}
def one_click_unsubscribe_link(user_profile: UserProfile, email_type: str) -> str:
"""
Generate a unique link that a logged-out user can visit to unsubscribe from
Zulip e-mails without having to first log in.
"""
return create_confirmation_link(user_profile,
Confirmation.UNSUBSCRIBE,
url_args = {'email_type': email_type})
# Functions related to links generated by the generate_realm_creation_link.py
# management command.
# Note that being validated here will just allow the user to access the create_realm
@@ -163,23 +145,23 @@ def validate_key(creation_key: Optional[str]) -> Optional['RealmCreationKey']:
raise RealmCreationKey.Invalid()
return key_record
-def generate_realm_creation_url(by_admin: bool=False) -> str:
+def generate_realm_creation_url(by_admin: bool=False) -> Text:
key = generate_key()
RealmCreationKey.objects.create(creation_key=key,
date_created=timezone_now(),
presume_email_valid=by_admin)
-return urljoin(
-settings.ROOT_DOMAIN_URI,
-reverse('zerver.views.create_realm', kwargs={'creation_key': key}),
-)
+return '%s%s%s' % (settings.EXTERNAL_URI_SCHEME,
+settings.EXTERNAL_HOST,
+reverse('zerver.views.create_realm',
+kwargs={'creation_key': key}))
class RealmCreationKey(models.Model):
-creation_key = models.CharField('activation key', db_index=True, max_length=40)
+creation_key = models.CharField('activation key', max_length=40)
date_created = models.DateTimeField('created', default=timezone_now)
# True just if we should presume the email address the user enters
# is theirs, and skip sending mail to it to confirm that.
-presume_email_valid: bool = models.BooleanField(default=False)
+presume_email_valid = models.BooleanField(default=False)  # type: bool
class Invalid(Exception):
pass

View File

@@ -1,6 +1,9 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2008, Jarek Zgoda <jarek.zgoda@gmail.com>
from typing import Any, Dict
__revision__ = '$Id: settings.py 12 2008-11-23 19:38:52Z jarek.zgoda $'
STATUS_ACTIVE = 1
STATUS_REVOKED = 2

View File

@@ -1,634 +0,0 @@
import logging
import math
import os
from datetime import datetime, timedelta
from decimal import Decimal
from functools import wraps
from typing import Callable, Dict, Optional, Tuple, TypeVar, cast
import orjson
import stripe
from django.conf import settings
from django.core.signing import Signer
from django.db import transaction
from django.utils.timezone import now as timezone_now
from django.utils.translation import override as override_language
from django.utils.translation import ugettext as _
from corporate.models import (
Customer,
CustomerPlan,
LicenseLedger,
get_current_plan_by_customer,
get_current_plan_by_realm,
get_customer_by_realm,
)
from zerver.lib.logging_util import log_to_file
from zerver.lib.timestamp import datetime_to_timestamp, timestamp_to_datetime
from zerver.lib.utils import generate_random_token
from zerver.models import Realm, RealmAuditLog, UserProfile, get_system_bot
from zproject.config import get_secret
STRIPE_PUBLISHABLE_KEY = get_secret('stripe_publishable_key')
stripe.api_key = get_secret('stripe_secret_key')
BILLING_LOG_PATH = os.path.join('/var/log/zulip'
if not settings.DEVELOPMENT
else settings.DEVELOPMENT_LOG_DIRECTORY,
'billing.log')
billing_logger = logging.getLogger('corporate.stripe')
log_to_file(billing_logger, BILLING_LOG_PATH)
log_to_file(logging.getLogger('stripe'), BILLING_LOG_PATH)
CallableT = TypeVar('CallableT', bound=Callable[..., object])
MIN_INVOICED_LICENSES = 30
MAX_INVOICED_LICENSES = 1000
DEFAULT_INVOICE_DAYS_UNTIL_DUE = 30
def get_latest_seat_count(realm: Realm) -> int:
non_guests = UserProfile.objects.filter(
realm=realm, is_active=True, is_bot=False).exclude(role=UserProfile.ROLE_GUEST).count()
guests = UserProfile.objects.filter(
realm=realm, is_active=True, is_bot=False, role=UserProfile.ROLE_GUEST).count()
return max(non_guests, math.ceil(guests / 5))
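# Illustrative sketch (editorial, not part of the original file): guests are
# billed at a 1:5 ratio, so a realm with 10 non-guests and 60 guests is
# billed for max(10, ceil(60 / 5)) == 12 seats.
assert max(10, math.ceil(60 / 5)) == 12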
def sign_string(string: str) -> Tuple[str, str]:
salt = generate_random_token(64)
signer = Signer(salt=salt)
return signer.sign(string), salt
def unsign_string(signed_string: str, salt: str) -> str:
signer = Signer(salt=salt)
return signer.unsign(signed_string)
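# Illustrative round trip (editorial; assumes a configured Django SECRET_KEY,
# since Signer derives its HMAC key from it): a signed value can only be
# recovered with the same salt.
signed, salt = sign_string('42')
assert unsign_string(signed, salt) == '42'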
# Be extremely careful changing this function. Historical billing periods
# are not stored anywhere, and are just computed on the fly using this
# function. Any change you make here should return the same value (or be
# within a few seconds) for basically any value from when the billing system
# went online to within a year from now.
def add_months(dt: datetime, months: int) -> datetime:
assert(months >= 0)
# It's fine that the max day in Feb is 28 for leap years.
MAX_DAY_FOR_MONTH = {1: 31, 2: 28, 3: 31, 4: 30, 5: 31, 6: 30,
7: 31, 8: 31, 9: 30, 10: 31, 11: 30, 12: 31}
year = dt.year
month = dt.month + months
while month > 12:
year += 1
month -= 12
day = min(dt.day, MAX_DAY_FOR_MONTH[month])
# datetimes don't support leap seconds, so don't need to worry about those
return dt.replace(year=year, month=month, day=day)
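# Worked examples (editorial, illustrative): the day of month is clamped to
# the target month's maximum, and February is always treated as having 28
# days, even in leap years.
assert add_months(datetime(2020, 1, 31), 1) == datetime(2020, 2, 28)
assert add_months(datetime(2020, 1, 31), 2) == datetime(2020, 3, 31)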
def next_month(billing_cycle_anchor: datetime, dt: datetime) -> datetime:
estimated_months = round((dt - billing_cycle_anchor).days * 12. / 365)
for months in range(max(estimated_months - 1, 0), estimated_months + 2):
proposed_next_month = add_months(billing_cycle_anchor, months)
if 20 < (proposed_next_month - dt).days < 40:
return proposed_next_month
raise AssertionError('Something wrong in next_month calculation with '
f'billing_cycle_anchor: {billing_cycle_anchor}, dt: {dt}')
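# Worked example (editorial, illustrative): starting from a day-count
# estimate, next_month returns the first anchor-aligned month boundary that
# lands 20-40 days past dt.
assert next_month(datetime(2021, 1, 31), datetime(2021, 2, 20)) == datetime(2021, 3, 31)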
def start_of_next_billing_cycle(plan: CustomerPlan, event_time: datetime) -> datetime:
if plan.status == CustomerPlan.FREE_TRIAL:
assert(plan.next_invoice_date is not None) # for mypy
return plan.next_invoice_date
months_per_period = {
CustomerPlan.ANNUAL: 12,
CustomerPlan.MONTHLY: 1,
}[plan.billing_schedule]
periods = 1
dt = plan.billing_cycle_anchor
while dt <= event_time:
dt = add_months(plan.billing_cycle_anchor, months_per_period * periods)
periods += 1
return dt
def next_invoice_date(plan: CustomerPlan) -> Optional[datetime]:
if plan.status == CustomerPlan.ENDED:
return None
assert(plan.next_invoice_date is not None) # for mypy
months_per_period = {
CustomerPlan.ANNUAL: 12,
CustomerPlan.MONTHLY: 1,
}[plan.billing_schedule]
if plan.automanage_licenses:
months_per_period = 1
periods = 1
dt = plan.billing_cycle_anchor
while dt <= plan.next_invoice_date:
dt = add_months(plan.billing_cycle_anchor, months_per_period * periods)
periods += 1
return dt
def renewal_amount(plan: CustomerPlan, event_time: datetime) -> int: # nocoverage: TODO
if plan.fixed_price is not None:
return plan.fixed_price
new_plan, last_ledger_entry = make_end_of_cycle_updates_if_needed(plan, event_time)
if last_ledger_entry is None:
return 0
if last_ledger_entry.licenses_at_next_renewal is None:
return 0
if new_plan is not None:
plan = new_plan
assert(plan.price_per_license is not None) # for mypy
return plan.price_per_license * last_ledger_entry.licenses_at_next_renewal
def get_idempotency_key(ledger_entry: LicenseLedger) -> Optional[str]:
if settings.TEST_SUITE:
return None
return f'ledger_entry:{ledger_entry.id}' # nocoverage
class BillingError(Exception):
# error messages
CONTACT_SUPPORT = _("Something went wrong. Please contact {email}.").format(
email=settings.ZULIP_ADMINISTRATOR,
)
TRY_RELOADING = _("Something went wrong. Please reload the page.")
# description is used only for tests
def __init__(self, description: str, message: str=CONTACT_SUPPORT) -> None:
self.description = description
self.message = message
class StripeCardError(BillingError):
pass
class StripeConnectionError(BillingError):
pass
def catch_stripe_errors(func: CallableT) -> CallableT:
@wraps(func)
def wrapped(*args: object, **kwargs: object) -> object:
if settings.DEVELOPMENT and not settings.TEST_SUITE: # nocoverage
if STRIPE_PUBLISHABLE_KEY is None:
raise BillingError('missing stripe config', "Missing Stripe config. "
"See https://zulip.readthedocs.io/en/latest/subsystems/billing.html.")
try:
return func(*args, **kwargs)
# See https://stripe.com/docs/api/python#error_handling, though
# https://stripe.com/docs/api/ruby#error_handling suggests there are additional fields, and
# https://stripe.com/docs/error-codes gives a more detailed set of error codes
except stripe.error.StripeError as e:
err = e.json_body.get('error', {})
billing_logger.error(
"Stripe error: %s %s %s %s",
e.http_status, err.get('type'), err.get('code'), err.get('param'),
)
if isinstance(e, stripe.error.CardError):
# TODO: Look into i18n for this
raise StripeCardError('card error', err.get('message'))
if isinstance(e, (stripe.error.RateLimitError, stripe.error.APIConnectionError)): # nocoverage TODO
raise StripeConnectionError(
'stripe connection error',
_("Something went wrong. Please wait a few seconds and try again."))
raise BillingError('other stripe error', BillingError.CONTACT_SUPPORT)
return cast(CallableT, wrapped)
@catch_stripe_errors
def stripe_get_customer(stripe_customer_id: str) -> stripe.Customer:
return stripe.Customer.retrieve(stripe_customer_id, expand=["default_source"])
@catch_stripe_errors
def do_create_stripe_customer(user: UserProfile, stripe_token: Optional[str]=None) -> Customer:
realm = user.realm
# We could do a better job of handling race conditions here, but if two
# people from a realm try to upgrade at exactly the same time, the main
# bad thing that will happen is that we will create an extra stripe
# customer that we can delete or ignore.
stripe_customer = stripe.Customer.create(
description=f"{realm.string_id} ({realm.name})",
email=user.delivery_email,
metadata={'realm_id': realm.id, 'realm_str': realm.string_id},
source=stripe_token)
event_time = timestamp_to_datetime(stripe_customer.created)
with transaction.atomic():
RealmAuditLog.objects.create(
realm=user.realm, acting_user=user, event_type=RealmAuditLog.STRIPE_CUSTOMER_CREATED,
event_time=event_time)
if stripe_token is not None:
RealmAuditLog.objects.create(
realm=user.realm, acting_user=user, event_type=RealmAuditLog.STRIPE_CARD_CHANGED,
event_time=event_time)
customer, created = Customer.objects.update_or_create(realm=realm, defaults={
'stripe_customer_id': stripe_customer.id})
user.is_billing_admin = True
user.save(update_fields=["is_billing_admin"])
return customer
@catch_stripe_errors
def do_replace_payment_source(user: UserProfile, stripe_token: str,
pay_invoices: bool=False) -> stripe.Customer:
customer = get_customer_by_realm(user.realm)
assert(customer is not None) # for mypy
stripe_customer = stripe_get_customer(customer.stripe_customer_id)
stripe_customer.source = stripe_token
# Deletes existing card: https://stripe.com/docs/api#update_customer-source
updated_stripe_customer = stripe.Customer.save(stripe_customer)
RealmAuditLog.objects.create(
realm=user.realm, acting_user=user, event_type=RealmAuditLog.STRIPE_CARD_CHANGED,
event_time=timezone_now())
if pay_invoices:
for stripe_invoice in stripe.Invoice.list(
billing='charge_automatically', customer=stripe_customer.id, status='open'):
# The user will get either a receipt or a "failed payment" email, but the in-app
# messaging could be clearer here (e.g. it could explicitly tell the user that there
# were payment(s) and that they succeeded or failed).
# Worth fixing if we notice that a lot of cards end up failing at this step.
stripe.Invoice.pay(stripe_invoice)
return updated_stripe_customer
# event_time should roughly be timezone_now(). Not designed to handle
# event_times in the past or future
@transaction.atomic
def make_end_of_cycle_updates_if_needed(plan: CustomerPlan,
event_time: datetime) -> Tuple[Optional[CustomerPlan], Optional[LicenseLedger]]:
last_ledger_entry = LicenseLedger.objects.filter(plan=plan).order_by('-id').first()
last_renewal = LicenseLedger.objects.filter(plan=plan, is_renewal=True) \
.order_by('-id').first().event_time
next_billing_cycle = start_of_next_billing_cycle(plan, last_renewal)
if next_billing_cycle <= event_time:
if plan.status == CustomerPlan.ACTIVE:
return None, LicenseLedger.objects.create(
plan=plan, is_renewal=True, event_time=next_billing_cycle,
licenses=last_ledger_entry.licenses_at_next_renewal,
licenses_at_next_renewal=last_ledger_entry.licenses_at_next_renewal)
if plan.status == CustomerPlan.FREE_TRIAL:
plan.invoiced_through = last_ledger_entry
assert(plan.next_invoice_date is not None)
plan.billing_cycle_anchor = plan.next_invoice_date.replace(microsecond=0)
plan.status = CustomerPlan.ACTIVE
plan.save(update_fields=["invoiced_through", "billing_cycle_anchor", "status"])
return None, LicenseLedger.objects.create(
plan=plan, is_renewal=True, event_time=next_billing_cycle,
licenses=last_ledger_entry.licenses_at_next_renewal,
licenses_at_next_renewal=last_ledger_entry.licenses_at_next_renewal)
if plan.status == CustomerPlan.SWITCH_TO_ANNUAL_AT_END_OF_CYCLE:
if plan.fixed_price is not None: # nocoverage
raise NotImplementedError("Can't switch fixed priced monthly plan to annual.")
plan.status = CustomerPlan.ENDED
plan.save(update_fields=["status"])
discount = plan.customer.default_discount or plan.discount
_, _, _, price_per_license = compute_plan_parameters(
automanage_licenses=plan.automanage_licenses, billing_schedule=CustomerPlan.ANNUAL,
discount=plan.discount
)
new_plan = CustomerPlan.objects.create(
customer=plan.customer, billing_schedule=CustomerPlan.ANNUAL, automanage_licenses=plan.automanage_licenses,
charge_automatically=plan.charge_automatically, price_per_license=price_per_license,
discount=discount, billing_cycle_anchor=next_billing_cycle,
tier=plan.tier, status=CustomerPlan.ACTIVE, next_invoice_date=next_billing_cycle,
invoiced_through=None, invoicing_status=CustomerPlan.INITIAL_INVOICE_TO_BE_SENT,
)
new_plan_ledger_entry = LicenseLedger.objects.create(
plan=new_plan, is_renewal=True, event_time=next_billing_cycle,
licenses=last_ledger_entry.licenses_at_next_renewal,
licenses_at_next_renewal=last_ledger_entry.licenses_at_next_renewal
)
RealmAuditLog.objects.create(
realm=new_plan.customer.realm, event_time=event_time,
event_type=RealmAuditLog.CUSTOMER_SWITCHED_FROM_MONTHLY_TO_ANNUAL_PLAN,
extra_data=orjson.dumps({
"monthly_plan_id": plan.id,
"annual_plan_id": new_plan.id,
}).decode()
)
return new_plan, new_plan_ledger_entry
if plan.status == CustomerPlan.DOWNGRADE_AT_END_OF_CYCLE:
process_downgrade(plan)
return None, None
return None, last_ledger_entry
# Returns Customer instead of stripe_customer so that we don't make a Stripe
# API call if there's nothing to update
def update_or_create_stripe_customer(user: UserProfile, stripe_token: Optional[str]=None) -> Customer:
realm = user.realm
customer = get_customer_by_realm(realm)
if customer is None or customer.stripe_customer_id is None:
return do_create_stripe_customer(user, stripe_token=stripe_token)
if stripe_token is not None:
do_replace_payment_source(user, stripe_token)
return customer
def compute_plan_parameters(
automanage_licenses: bool, billing_schedule: int,
discount: Optional[Decimal],
free_trial: bool=False) -> Tuple[datetime, datetime, datetime, int]:
# Everything in Stripe is stored as timestamps with 1 second resolution,
# so standardize on 1 second resolution.
# TODO talk about leapseconds?
billing_cycle_anchor = timezone_now().replace(microsecond=0)
if billing_schedule == CustomerPlan.ANNUAL:
# TODO use variables to account for Zulip Plus
price_per_license = 8000
period_end = add_months(billing_cycle_anchor, 12)
elif billing_schedule == CustomerPlan.MONTHLY:
price_per_license = 800
period_end = add_months(billing_cycle_anchor, 1)
else:
raise AssertionError(f'Unknown billing_schedule: {billing_schedule}')
if discount is not None:
# There are no fractional cents in Stripe, so round down to nearest integer.
price_per_license = int(float(price_per_license * (1 - discount / 100)) + .00001)
next_invoice_date = period_end
if automanage_licenses:
next_invoice_date = add_months(billing_cycle_anchor, 1)
if free_trial:
period_end = billing_cycle_anchor + timedelta(days=settings.FREE_TRIAL_DAYS)
next_invoice_date = period_end
return billing_cycle_anchor, next_invoice_date, period_end, price_per_license
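# Worked example (editorial, illustrative): a 25% discount on the annual
# price of 8000 cents per license rounds down to whole cents.
assert int(float(8000 * (1 - Decimal(25) / 100)) + .00001) == 6000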
def decimal_to_float(obj: object) -> object:
if isinstance(obj, Decimal):
return float(obj)
raise TypeError # nocoverage
# Only used for cloud signups
@catch_stripe_errors
def process_initial_upgrade(user: UserProfile, licenses: int, automanage_licenses: bool,
billing_schedule: int, stripe_token: Optional[str]) -> None:
realm = user.realm
customer = update_or_create_stripe_customer(user, stripe_token=stripe_token)
charge_automatically = stripe_token is not None
free_trial = settings.FREE_TRIAL_DAYS not in (None, 0)
if get_current_plan_by_customer(customer) is not None:
# Unlikely race condition from two people upgrading (clicking "Make payment")
# at exactly the same time. Doesn't fully resolve the race condition, but having
# a check here reduces the likelihood.
billing_logger.warning(
"Customer %s trying to upgrade, but has an active subscription", customer,
)
raise BillingError('subscribing with existing subscription', BillingError.TRY_RELOADING)
billing_cycle_anchor, next_invoice_date, period_end, price_per_license = compute_plan_parameters(
automanage_licenses, billing_schedule, customer.default_discount, free_trial)
# The main design constraint in this function is that if you upgrade with a credit card, and the
# charge fails, everything should be rolled back as if nothing had happened. This is because we
# expect frequent card failures on initial signup.
# Hence, if we're going to charge a card, do it at the beginning, even if we later may have to
# adjust the number of licenses.
if charge_automatically:
if not free_trial:
stripe_charge = stripe.Charge.create(
amount=price_per_license * licenses,
currency='usd',
customer=customer.stripe_customer_id,
description=f"Upgrade to Zulip Standard, ${price_per_license/100} x {licenses}",
receipt_email=user.delivery_email,
statement_descriptor='Zulip Standard')
# Not setting a period start and end, but maybe we should? Unclear what will make things
# most similar to the renewal case from an accounting perspective.
assert isinstance(stripe_charge.source, stripe.Card)
description = f"Payment (Card ending in {stripe_charge.source.last4})"
stripe.InvoiceItem.create(
amount=price_per_license * licenses * -1,
currency='usd',
customer=customer.stripe_customer_id,
description=description,
discountable=False)
# TODO: The correctness of this relies on user creation, deactivation, etc being
# in a transaction.atomic() with the relevant RealmAuditLog entries
with transaction.atomic():
# billed_licenses can be greater than licenses if users are added between the start of
# this function (process_initial_upgrade) and now
billed_licenses = max(get_latest_seat_count(realm), licenses)
plan_params = {
'automanage_licenses': automanage_licenses,
'charge_automatically': charge_automatically,
'price_per_license': price_per_license,
'discount': customer.default_discount,
'billing_cycle_anchor': billing_cycle_anchor,
'billing_schedule': billing_schedule,
'tier': CustomerPlan.STANDARD}
if free_trial:
plan_params['status'] = CustomerPlan.FREE_TRIAL
plan = CustomerPlan.objects.create(
customer=customer,
next_invoice_date=next_invoice_date,
**plan_params)
ledger_entry = LicenseLedger.objects.create(
plan=plan,
is_renewal=True,
event_time=billing_cycle_anchor,
licenses=billed_licenses,
licenses_at_next_renewal=billed_licenses)
plan.invoiced_through = ledger_entry
plan.save(update_fields=['invoiced_through'])
RealmAuditLog.objects.create(
realm=realm, acting_user=user, event_time=billing_cycle_anchor,
event_type=RealmAuditLog.CUSTOMER_PLAN_CREATED,
extra_data=orjson.dumps(plan_params, default=decimal_to_float).decode())
if not free_trial:
stripe.InvoiceItem.create(
currency='usd',
customer=customer.stripe_customer_id,
description='Zulip Standard',
discountable=False,
period = {'start': datetime_to_timestamp(billing_cycle_anchor),
'end': datetime_to_timestamp(period_end)},
quantity=billed_licenses,
unit_amount=price_per_license)
if charge_automatically:
billing_method = 'charge_automatically'
days_until_due = None
else:
billing_method = 'send_invoice'
days_until_due = DEFAULT_INVOICE_DAYS_UNTIL_DUE
stripe_invoice = stripe.Invoice.create(
auto_advance=True,
billing=billing_method,
customer=customer.stripe_customer_id,
days_until_due=days_until_due,
statement_descriptor='Zulip Standard')
stripe.Invoice.finalize_invoice(stripe_invoice)
from zerver.lib.actions import do_change_plan_type
do_change_plan_type(realm, Realm.STANDARD)
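
# A minimal illustration, not part of the original diff, of the net-zero invoice
# pattern in process_initial_upgrade above: when paying by card, the full amount is
# charged up front, then a negative InvoiceItem for the same amount is attached, so
# the finalized invoice documents the purchase while netting to $0. The numbers are
# hypothetical, chosen to match the "$12.0 x 6" fixtures later in this diff.
def _example_charge_then_credit() -> None:
    price_per_license = 1200                       # cents, i.e. $12.00 per license
    licenses = 6
    charge_amount = price_per_license * licenses   # 7200 cents == $72.00
    # stripe.Charge.create(amount=charge_amount, ...) charges the card;
    # stripe.InvoiceItem.create(amount=-charge_amount, ...) credits it back as the
    # "Payment (Card ending in ...)" line item, so the invoice totals zero.
    assert charge_amount == 7200
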
def update_license_ledger_for_automanaged_plan(realm: Realm, plan: CustomerPlan,
event_time: datetime) -> None:
new_plan, last_ledger_entry = make_end_of_cycle_updates_if_needed(plan, event_time)
if last_ledger_entry is None:
return
if new_plan is not None:
plan = new_plan
licenses_at_next_renewal = get_latest_seat_count(realm)
licenses = max(licenses_at_next_renewal, last_ledger_entry.licenses)
LicenseLedger.objects.create(
plan=plan, event_time=event_time, licenses=licenses,
licenses_at_next_renewal=licenses_at_next_renewal)
def update_license_ledger_if_needed(realm: Realm, event_time: datetime) -> None:
plan = get_current_plan_by_realm(realm)
if plan is None:
return
if not plan.automanage_licenses:
return
update_license_ledger_for_automanaged_plan(realm, plan, event_time)
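
# A hedged sketch, not from the diff, of the high-water-mark rule implemented in
# update_license_ledger_for_automanaged_plan above: within a billing cycle the
# billed license count only ratchets upward with the seat count, while the count
# used at the next renewal tracks the current seat count exactly.
def _example_license_high_water_mark() -> None:
    last_billed = 10                   # hypothetical last_ledger_entry.licenses
    for seat_count in (8, 12, 9):
        licenses = max(seat_count, last_billed)  # 10, then 12, then 12
        licenses_at_next_renewal = seat_count    # 8, then 12, then 9
        last_billed = licenses
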
def invoice_plan(plan: CustomerPlan, event_time: datetime) -> None:
if plan.invoicing_status == CustomerPlan.STARTED:
raise NotImplementedError('Plan with invoicing_status==STARTED needs manual resolution.')
make_end_of_cycle_updates_if_needed(plan, event_time)
if plan.invoicing_status == CustomerPlan.INITIAL_INVOICE_TO_BE_SENT:
invoiced_through_id = -1
licenses_base = None
else:
        assert plan.invoiced_through is not None
licenses_base = plan.invoiced_through.licenses
invoiced_through_id = plan.invoiced_through.id
invoice_item_created = False
for ledger_entry in LicenseLedger.objects.filter(plan=plan, id__gt=invoiced_through_id,
event_time__lte=event_time).order_by('id'):
price_args: Dict[str, int] = {}
if ledger_entry.is_renewal:
if plan.fixed_price is not None:
price_args = {'amount': plan.fixed_price}
else:
                assert plan.price_per_license is not None  # needed for mypy
price_args = {'unit_amount': plan.price_per_license,
'quantity': ledger_entry.licenses}
description = "Zulip Standard - renewal"
elif licenses_base is not None and ledger_entry.licenses != licenses_base:
            assert plan.price_per_license
last_renewal = LicenseLedger.objects.filter(
plan=plan, is_renewal=True, event_time__lte=ledger_entry.event_time) \
.order_by('-id').first().event_time
period_end = start_of_next_billing_cycle(plan, ledger_entry.event_time)
proration_fraction = (period_end - ledger_entry.event_time) / (period_end - last_renewal)
price_args = {'unit_amount': int(plan.price_per_license * proration_fraction + .5),
'quantity': ledger_entry.licenses - licenses_base}
description = "Additional license ({} - {})".format(
ledger_entry.event_time.strftime('%b %-d, %Y'), period_end.strftime('%b %-d, %Y'))
if price_args:
plan.invoiced_through = ledger_entry
plan.invoicing_status = CustomerPlan.STARTED
plan.save(update_fields=['invoicing_status', 'invoiced_through'])
stripe.InvoiceItem.create(
currency='usd',
customer=plan.customer.stripe_customer_id,
description=description,
discountable=False,
                period={'start': datetime_to_timestamp(ledger_entry.event_time),
                        'end': datetime_to_timestamp(
                            start_of_next_billing_cycle(plan, ledger_entry.event_time))},
idempotency_key=get_idempotency_key(ledger_entry),
**price_args)
invoice_item_created = True
plan.invoiced_through = ledger_entry
plan.invoicing_status = CustomerPlan.DONE
plan.save(update_fields=['invoicing_status', 'invoiced_through'])
licenses_base = ledger_entry.licenses
if invoice_item_created:
if plan.charge_automatically:
billing_method = 'charge_automatically'
days_until_due = None
else:
billing_method = 'send_invoice'
days_until_due = DEFAULT_INVOICE_DAYS_UNTIL_DUE
stripe_invoice = stripe.Invoice.create(
auto_advance=True,
billing=billing_method,
customer=plan.customer.stripe_customer_id,
days_until_due=days_until_due,
statement_descriptor='Zulip Standard')
stripe.Invoice.finalize_invoice(stripe_invoice)
plan.next_invoice_date = next_invoice_date(plan)
plan.save(update_fields=['next_invoice_date'])
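
# Worked example, with hypothetical numbers, of the proration arithmetic in
# invoice_plan above: licenses added mid-cycle are billed for the remaining
# fraction of the cycle, rounded half-up to a whole number of cents per license.
def _example_proration() -> None:
    from datetime import datetime
    price_per_license = 1200                # cents for a full annual cycle
    last_renewal = datetime(2020, 1, 1)
    period_end = datetime(2021, 1, 1)       # 366-day (leap-year) cycle
    event_time = datetime(2020, 10, 1)      # licenses added with 92 days left
    proration_fraction = (period_end - event_time) / (period_end - last_renewal)
    prorated_unit_amount = int(price_per_license * proration_fraction + .5)
    assert prorated_unit_amount == 302      # 92/366 of $12.00, to the cent
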
def invoice_plans_as_needed(event_time: Optional[datetime] = None) -> None:
    # Evaluating timezone_now() in the signature would freeze the default at
    # import time; compute the current time at call time instead.
    if event_time is None:
        event_time = timezone_now()
    for plan in CustomerPlan.objects.filter(next_invoice_date__lte=event_time):
        invoice_plan(plan, event_time)
def attach_discount_to_realm(realm: Realm, discount: Decimal) -> None:
Customer.objects.update_or_create(realm=realm, defaults={'default_discount': discount})
def update_sponsorship_status(realm: Realm, sponsorship_pending: bool) -> None:
customer, _ = Customer.objects.get_or_create(realm=realm)
customer.sponsorship_pending = sponsorship_pending
customer.save(update_fields=["sponsorship_pending"])
def approve_sponsorship(realm: Realm) -> None:
from zerver.lib.actions import do_change_plan_type, internal_send_private_message
do_change_plan_type(realm, Realm.STANDARD_FREE)
customer = get_customer_by_realm(realm)
if customer is not None and customer.sponsorship_pending:
customer.sponsorship_pending = False
customer.save(update_fields=["sponsorship_pending"])
notification_bot = get_system_bot(settings.NOTIFICATION_BOT)
for billing_admin in realm.get_human_billing_admin_users():
with override_language(billing_admin.default_language):
            # Using variables makes life easier for translators if these details change.
            plan_name = "Zulip Cloud Standard"
            emoji = ":tada:"
            # Translators need a constant string to look up, so interpolate with
            # .format() after the _() call rather than using an f-string inside _().
            message = _(
                "Your organization's request for sponsored hosting has been approved! {emoji}\n"
                "You have been upgraded to {plan_name}, free of charge."
            ).format(emoji=emoji, plan_name=plan_name)
internal_send_private_message(billing_admin.realm, notification_bot, billing_admin, message)
def get_discount_for_realm(realm: Realm) -> Optional[Decimal]:
customer = get_customer_by_realm(realm)
if customer is not None:
return customer.default_discount
return None
def do_change_plan_status(plan: CustomerPlan, status: int) -> None:
plan.status = status
plan.save(update_fields=['status'])
billing_logger.info(
'Change plan status: Customer.id: %s, CustomerPlan.id: %s, status: %s',
plan.customer.id, plan.id, status,
)
def process_downgrade(plan: CustomerPlan) -> None:
from zerver.lib.actions import do_change_plan_type
do_change_plan_type(plan.customer.realm, Realm.LIMITED)
plan.status = CustomerPlan.ENDED
plan.save(update_fields=['status'])
def estimate_annual_recurring_revenue_by_realm() -> Dict[str, int]: # nocoverage
annual_revenue = {}
for plan in CustomerPlan.objects.filter(
status=CustomerPlan.ACTIVE).select_related('customer__realm'):
# TODO: figure out what to do for plans that don't automatically
# renew, but which probably will renew
renewal_cents = renewal_amount(plan, timezone_now())
if plan.billing_schedule == CustomerPlan.MONTHLY:
renewal_cents *= 12
# TODO: Decimal stuff
annual_revenue[plan.customer.realm.string_id] = int(renewal_cents / 100)
return annual_revenue
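
# Illustrative arithmetic, with made-up numbers, for the annualization above: a
# monthly plan whose upcoming renewal would invoice 7200 cents contributes
# 7200 * 12 == 86400 cents, i.e. int(86400 / 100) == 864 dollars, to estimated ARR.
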
# During realm deactivation we instantly downgrade the plan to Limited.
# Extra users added in the final month are not charged. This is also used
# when canceling a free trial.
def downgrade_now(realm: Realm) -> None:
plan = get_current_plan_by_realm(realm)
if plan is None:
return
process_downgrade(plan)
plan.invoiced_through = LicenseLedger.objects.filter(plan=plan).order_by('id').last()
plan.next_invoice_date = next_invoice_date(plan)
plan.save(update_fields=["invoiced_through", "next_invoice_date"])


@@ -1,51 +0,0 @@
# Generated by Django 1.11.14 on 2018-09-25 12:02
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
('zerver', '0189_userprofile_add_some_emojisets'),
]
operations = [
migrations.CreateModel(
name='BillingProcessor',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('state', models.CharField(max_length=20)),
('last_modified', models.DateTimeField(auto_now=True)),
('log_row', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='zerver.RealmAuditLog')),
('realm', models.OneToOneField(null=True, on_delete=django.db.models.deletion.CASCADE, to='zerver.Realm')),
],
),
migrations.CreateModel(
name='Coupon',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('percent_off', models.SmallIntegerField(unique=True)),
('stripe_coupon_id', models.CharField(max_length=255, unique=True)),
],
),
migrations.CreateModel(
name='Customer',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('stripe_customer_id', models.CharField(max_length=255, unique=True)),
('has_billing_relationship', models.BooleanField(default=False)),
('realm', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, to='zerver.Realm')),
],
),
migrations.CreateModel(
name='Plan',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('nickname', models.CharField(max_length=40, unique=True)),
('stripe_plan_id', models.CharField(max_length=255, unique=True)),
],
),
]


@@ -1,18 +0,0 @@
# Generated by Django 1.11.16 on 2018-12-12 20:19
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0001_initial'),
]
operations = [
migrations.AddField(
model_name='customer',
name='default_discount',
field=models.DecimalField(decimal_places=4, max_digits=7, null=True),
),
]


@@ -1,33 +0,0 @@
# Generated by Django 1.11.16 on 2018-12-22 21:05
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0002_customer_default_discount'),
]
operations = [
migrations.CreateModel(
name='CustomerPlan',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('licenses', models.IntegerField()),
('automanage_licenses', models.BooleanField(default=False)),
('charge_automatically', models.BooleanField(default=False)),
('price_per_license', models.IntegerField(null=True)),
('fixed_price', models.IntegerField(null=True)),
('discount', models.DecimalField(decimal_places=4, max_digits=6, null=True)),
('billing_cycle_anchor', models.DateTimeField()),
('billing_schedule', models.SmallIntegerField()),
('billed_through', models.DateTimeField()),
('next_billing_date', models.DateTimeField(db_index=True)),
('tier', models.SmallIntegerField()),
('status', models.SmallIntegerField(default=1)),
('customer', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='corporate.Customer')),
],
),
]


@@ -1,25 +0,0 @@
# Generated by Django 1.11.18 on 2019-01-19 05:01
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0003_customerplan'),
]
operations = [
migrations.CreateModel(
name='LicenseLedger',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('is_renewal', models.BooleanField(default=False)),
('event_time', models.DateTimeField()),
('licenses', models.IntegerField()),
('licenses_at_next_renewal', models.IntegerField(null=True)),
('plan', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='corporate.CustomerPlan')),
],
),
]


@@ -1,33 +0,0 @@
# Generated by Django 1.11.18 on 2019-01-28 13:04
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0004_licenseledger'),
]
operations = [
migrations.RenameField(
model_name='customerplan',
old_name='next_billing_date',
new_name='next_invoice_date',
),
migrations.RemoveField(
model_name='customerplan',
name='billed_through',
),
migrations.AddField(
model_name='customerplan',
name='invoiced_through',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='+', to='corporate.LicenseLedger'),
),
migrations.AddField(
model_name='customerplan',
name='invoicing_status',
field=models.SmallIntegerField(default=1),
),
]


@@ -1,18 +0,0 @@
# Generated by Django 1.11.18 on 2019-01-29 01:46
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0005_customerplan_invoicing'),
]
operations = [
migrations.AlterField(
model_name='customer',
name='stripe_customer_id',
field=models.CharField(max_length=255, null=True, unique=True),
),
]


@@ -1,38 +0,0 @@
# Generated by Django 1.11.18 on 2019-01-31 22:16
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('corporate', '0006_nullable_stripe_customer_id'),
]
operations = [
migrations.RemoveField(
model_name='billingprocessor',
name='log_row',
),
migrations.RemoveField(
model_name='billingprocessor',
name='realm',
),
migrations.DeleteModel(
name='Coupon',
),
migrations.DeleteModel(
name='Plan',
),
migrations.RemoveField(
model_name='customer',
name='has_billing_relationship',
),
migrations.RemoveField(
model_name='customerplan',
name='licenses',
),
migrations.DeleteModel(
name='BillingProcessor',
),
]


@@ -1,18 +0,0 @@
# Generated by Django 1.11.20 on 2019-04-11 00:45
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0007_remove_deprecated_fields'),
]
operations = [
migrations.AlterField(
model_name='customerplan',
name='next_invoice_date',
field=models.DateTimeField(db_index=True, null=True),
),
]


@@ -1,18 +0,0 @@
# Generated by Django 2.2.13 on 2020-06-09 12:09
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0008_nullable_next_invoice_date'),
]
operations = [
migrations.AddField(
model_name='customer',
name='sponsorship_pending',
field=models.BooleanField(default=False),
),
]


@@ -1,104 +0,0 @@
import datetime
from decimal import Decimal
from typing import Optional
from django.db import models
from django.db.models import CASCADE
from zerver.models import Realm
class Customer(models.Model):
realm: Realm = models.OneToOneField(Realm, on_delete=CASCADE)
stripe_customer_id: str = models.CharField(max_length=255, null=True, unique=True)
sponsorship_pending: bool = models.BooleanField(default=False)
# A percentage, like 85.
default_discount: Optional[Decimal] = models.DecimalField(decimal_places=4, max_digits=7, null=True)
def __str__(self) -> str:
return f"<Customer {self.realm} {self.stripe_customer_id}>"
def get_customer_by_realm(realm: Realm) -> Optional[Customer]:
return Customer.objects.filter(realm=realm).first()
class CustomerPlan(models.Model):
customer: Customer = models.ForeignKey(Customer, on_delete=CASCADE)
automanage_licenses: bool = models.BooleanField(default=False)
charge_automatically: bool = models.BooleanField(default=False)
# Both of these are in cents. Exactly one of price_per_license or
# fixed_price should be set. fixed_price is only for manual deals, and
# can't be set via the self-serve billing system.
price_per_license: Optional[int] = models.IntegerField(null=True)
fixed_price: Optional[int] = models.IntegerField(null=True)
# Discount that was applied. For display purposes only.
discount: Optional[Decimal] = models.DecimalField(decimal_places=4, max_digits=6, null=True)
billing_cycle_anchor: datetime.datetime = models.DateTimeField()
ANNUAL = 1
MONTHLY = 2
billing_schedule: int = models.SmallIntegerField()
next_invoice_date: Optional[datetime.datetime] = models.DateTimeField(db_index=True, null=True)
invoiced_through: Optional["LicenseLedger"] = models.ForeignKey(
'LicenseLedger', null=True, on_delete=CASCADE, related_name='+')
DONE = 1
STARTED = 2
INITIAL_INVOICE_TO_BE_SENT = 3
invoicing_status: int = models.SmallIntegerField(default=DONE)
STANDARD = 1
PLUS = 2 # not available through self-serve signup
ENTERPRISE = 10
tier: int = models.SmallIntegerField()
ACTIVE = 1
DOWNGRADE_AT_END_OF_CYCLE = 2
FREE_TRIAL = 3
SWITCH_TO_ANNUAL_AT_END_OF_CYCLE = 4
# "Live" plans should have a value < LIVE_STATUS_THRESHOLD.
# There should be at most one live plan per customer.
LIVE_STATUS_THRESHOLD = 10
ENDED = 11
NEVER_STARTED = 12
status: int = models.SmallIntegerField(default=ACTIVE)
# TODO maybe override setattr to ensure billing_cycle_anchor, etc are immutable
@property
def name(self) -> str:
return {
CustomerPlan.STANDARD: 'Zulip Standard',
CustomerPlan.PLUS: 'Zulip Plus',
CustomerPlan.ENTERPRISE: 'Zulip Enterprise',
}[self.tier]
def get_plan_status_as_text(self) -> str:
return {
self.ACTIVE: "Active",
self.DOWNGRADE_AT_END_OF_CYCLE: "Scheduled for downgrade at end of cycle",
self.FREE_TRIAL: "Free trial",
self.ENDED: "Ended",
self.NEVER_STARTED: "Never started"
}[self.status]
def get_current_plan_by_customer(customer: Customer) -> Optional[CustomerPlan]:
return CustomerPlan.objects.filter(
customer=customer, status__lt=CustomerPlan.LIVE_STATUS_THRESHOLD).first()
def get_current_plan_by_realm(realm: Realm) -> Optional[CustomerPlan]:
customer = get_customer_by_realm(realm)
if customer is None:
return None
return get_current_plan_by_customer(customer)
class LicenseLedger(models.Model):
plan: CustomerPlan = models.ForeignKey(CustomerPlan, on_delete=CASCADE)
# Also True for the initial upgrade.
is_renewal: bool = models.BooleanField(default=False)
event_time: datetime.datetime = models.DateTimeField()
licenses: int = models.IntegerField()
# None means the plan does not automatically renew.
# This cannot be None if plan.automanage_licenses.
licenses_at_next_renewal: Optional[int] = models.IntegerField(null=True)
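
# A minimal sketch, not part of the diff, of the "at most one live plan" invariant
# documented on CustomerPlan.LIVE_STATUS_THRESHOLD above: every live status
# (ACTIVE, DOWNGRADE_AT_END_OF_CYCLE, FREE_TRIAL, SWITCH_TO_ANNUAL_AT_END_OF_CYCLE)
# sorts below the threshold, which is why get_current_plan_by_customer can find the
# live plan with a single status__lt filter and .first().
def _example_live_plan_lookup(customer: Customer) -> Optional[CustomerPlan]:
    live_plans = CustomerPlan.objects.filter(
        customer=customer, status__lt=CustomerPlan.LIVE_STATUS_THRESHOLD)
    assert live_plans.count() <= 1  # enforced by convention, not a database constraint
    return live_plans.first()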


@@ -1,112 +0,0 @@
{
"amount": 7200,
"amount_refunded": 0,
"application": null,
"application_fee": null,
"application_fee_amount": null,
"balance_transaction": "txn_NORMALIZED00000000000001",
"billing_details": {
"address": {
"city": "Pacific",
"country": "United States",
"line1": "Under the sea,",
"line2": null,
"postal_code": "33333",
"state": null
},
"email": null,
"name": "Ada Starr",
"phone": null
},
"captured": true,
"created": 1000000000,
"currency": "usd",
"customer": "cus_NORMALIZED0001",
"description": "Upgrade to Zulip Standard, $12.0 x 6",
"destination": null,
"dispute": null,
"failure_code": null,
"failure_message": null,
"fraud_details": {},
"id": "ch_NORMALIZED00000000000001",
"invoice": null,
"livemode": false,
"metadata": {},
"object": "charge",
"on_behalf_of": null,
"order": null,
"outcome": {
"network_status": "approved_by_network",
"reason": null,
"risk_level": "normal",
"risk_score": 0,
"seller_message": "Payment complete.",
"type": "authorized"
},
"paid": true,
"payment_intent": null,
"payment_method": "card_NORMALIZED00000000000001",
"payment_method_details": {
"card": {
"brand": "visa",
"checks": {
"address_line1_check": "pass",
"address_postal_code_check": "pass",
"cvc_check": "pass"
},
"country": "US",
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"last4": "4242",
"three_d_secure": null,
"wallet": null
},
"type": "card"
},
"receipt_email": "hamlet@zulip.com",
"receipt_number": null,
"receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000001/rcpt_NORMALIZED000000000000000000001",
"refunded": false,
"refunds": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/charges/ch_NORMALIZED00000000000001/refunds"
},
"review": null,
"shipping": null,
"source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"source_transfer": null,
"statement_descriptor": "Zulip Standard",
"statement_descriptor_suffix": "Zulip Standard",
"status": "succeeded",
"transfer_data": null,
"transfer_group": null
}


@@ -1,112 +0,0 @@
{
"amount": 36000,
"amount_refunded": 0,
"application": null,
"application_fee": null,
"application_fee_amount": null,
"balance_transaction": "txn_NORMALIZED00000000000002",
"billing_details": {
"address": {
"city": "Pacific",
"country": "United States",
"line1": "Under the sea,",
"line2": null,
"postal_code": "33333",
"state": null
},
"email": null,
"name": "Ada Starr",
"phone": null
},
"captured": true,
"created": 1000000000,
"currency": "usd",
"customer": "cus_NORMALIZED0001",
"description": "Upgrade to Zulip Standard, $60.0 x 6",
"destination": null,
"dispute": null,
"failure_code": null,
"failure_message": null,
"fraud_details": {},
"id": "ch_NORMALIZED00000000000002",
"invoice": null,
"livemode": false,
"metadata": {},
"object": "charge",
"on_behalf_of": null,
"order": null,
"outcome": {
"network_status": "approved_by_network",
"reason": null,
"risk_level": "normal",
"risk_score": 0,
"seller_message": "Payment complete.",
"type": "authorized"
},
"paid": true,
"payment_intent": null,
"payment_method": "card_NORMALIZED00000000000002",
"payment_method_details": {
"card": {
"brand": "visa",
"checks": {
"address_line1_check": "pass",
"address_postal_code_check": "pass",
"cvc_check": "pass"
},
"country": "US",
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"last4": "4242",
"three_d_secure": null,
"wallet": null
},
"type": "card"
},
"receipt_email": "hamlet@zulip.com",
"receipt_number": null,
"receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000002/rcpt_NORMALIZED000000000000000000002",
"refunded": false,
"refunds": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/charges/ch_NORMALIZED00000000000002/refunds"
},
"review": null,
"shipping": null,
"source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000002",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"source_transfer": null,
"statement_descriptor": "Zulip Standard",
"statement_descriptor_suffix": "Zulip Standard",
"status": "succeeded",
"transfer_data": null,
"transfer_group": null
}


@@ -1,113 +0,0 @@
{
"data": [
{
"amount": 7200,
"amount_refunded": 0,
"application": null,
"application_fee": null,
"application_fee_amount": null,
"balance_transaction": "txn_NORMALIZED00000000000001",
"billing_details": {
"address": {
"city": "Pacific",
"country": "United States",
"line1": "Under the sea,",
"line2": null,
"postal_code": "33333",
"state": null
},
"email": null,
"name": "Ada Starr",
"phone": null
},
"captured": true,
"created": 1000000000,
"currency": "usd",
"customer": "cus_NORMALIZED0001",
"description": "Upgrade to Zulip Standard, $12.0 x 6",
"destination": null,
"dispute": null,
"failure_code": null,
"failure_message": null,
"fraud_details": {},
"id": "ch_NORMALIZED00000000000001",
"invoice": null,
"livemode": false,
"metadata": {},
"object": "charge",
"on_behalf_of": null,
"order": null,
"outcome": {
"network_status": "approved_by_network",
"reason": null,
"risk_level": "normal",
"risk_score": 0,
"seller_message": "Payment complete.",
"type": "authorized"
},
"paid": true,
"payment_intent": null,
"payment_method": "card_NORMALIZED00000000000001",
"payment_method_details": {
"card": {
"brand": "visa",
"checks": {
"address_line1_check": "pass",
"address_postal_code_check": "pass",
"cvc_check": "pass"
},
"country": "US",
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"last4": "4242",
"three_d_secure": null,
"wallet": null
},
"type": "card"
},
"receipt_email": "hamlet@zulip.com",
"receipt_number": null,
"receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000001/rcpt_NORMALIZED000000000000000000001",
"refunded": false,
"refunds": {},
"review": null,
"shipping": null,
"source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"source_transfer": null,
"statement_descriptor": "Zulip Standard",
"statement_descriptor_suffix": "Zulip Standard",
"status": "succeeded",
"transfer_data": null,
"transfer_group": null
}
],
"has_more": false,
"object": "list",
"url": "/v1/charges"
}


@@ -1,219 +0,0 @@
{
"data": [
{
"amount": 36000,
"amount_refunded": 0,
"application": null,
"application_fee": null,
"application_fee_amount": null,
"balance_transaction": "txn_NORMALIZED00000000000002",
"billing_details": {
"address": {
"city": "Pacific",
"country": "United States",
"line1": "Under the sea,",
"line2": null,
"postal_code": "33333",
"state": null
},
"email": null,
"name": "Ada Starr",
"phone": null
},
"captured": true,
"created": 1000000000,
"currency": "usd",
"customer": "cus_NORMALIZED0001",
"description": "Upgrade to Zulip Standard, $60.0 x 6",
"destination": null,
"dispute": null,
"failure_code": null,
"failure_message": null,
"fraud_details": {},
"id": "ch_NORMALIZED00000000000002",
"invoice": null,
"livemode": false,
"metadata": {},
"object": "charge",
"on_behalf_of": null,
"order": null,
"outcome": {
"network_status": "approved_by_network",
"reason": null,
"risk_level": "normal",
"risk_score": 0,
"seller_message": "Payment complete.",
"type": "authorized"
},
"paid": true,
"payment_intent": null,
"payment_method": "card_NORMALIZED00000000000002",
"payment_method_details": {
"card": {
"brand": "visa",
"checks": {
"address_line1_check": "pass",
"address_postal_code_check": "pass",
"cvc_check": "pass"
},
"country": "US",
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"last4": "4242",
"three_d_secure": null,
"wallet": null
},
"type": "card"
},
"receipt_email": "hamlet@zulip.com",
"receipt_number": null,
"receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000002/rcpt_NORMALIZED000000000000000000002",
"refunded": false,
"refunds": {},
"review": null,
"shipping": null,
"source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000002",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"source_transfer": null,
"statement_descriptor": "Zulip Standard",
"statement_descriptor_suffix": "Zulip Standard",
"status": "succeeded",
"transfer_data": null,
"transfer_group": null
},
{
"amount": 7200,
"amount_refunded": 0,
"application": null,
"application_fee": null,
"application_fee_amount": null,
"balance_transaction": "txn_NORMALIZED00000000000001",
"billing_details": {
"address": {
"city": "Pacific",
"country": "United States",
"line1": "Under the sea,",
"line2": null,
"postal_code": "33333",
"state": null
},
"email": null,
"name": "Ada Starr",
"phone": null
},
"captured": true,
"created": 1000000000,
"currency": "usd",
"customer": "cus_NORMALIZED0001",
"description": "Upgrade to Zulip Standard, $12.0 x 6",
"destination": null,
"dispute": null,
"failure_code": null,
"failure_message": null,
"fraud_details": {},
"id": "ch_NORMALIZED00000000000001",
"invoice": null,
"livemode": false,
"metadata": {},
"object": "charge",
"on_behalf_of": null,
"order": null,
"outcome": {
"network_status": "approved_by_network",
"reason": null,
"risk_level": "normal",
"risk_score": 0,
"seller_message": "Payment complete.",
"type": "authorized"
},
"paid": true,
"payment_intent": null,
"payment_method": "card_NORMALIZED00000000000001",
"payment_method_details": {
"card": {
"brand": "visa",
"checks": {
"address_line1_check": "pass",
"address_postal_code_check": "pass",
"cvc_check": "pass"
},
"country": "US",
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"last4": "4242",
"three_d_secure": null,
"wallet": null
},
"type": "card"
},
"receipt_email": "hamlet@zulip.com",
"receipt_number": null,
"receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000001/rcpt_NORMALIZED000000000000000000001",
"refunded": false,
"refunds": {},
"review": null,
"shipping": null,
"source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"source_transfer": null,
"statement_descriptor": "Zulip Standard",
"statement_descriptor_suffix": "Zulip Standard",
"status": "succeeded",
"transfer_data": null,
"transfer_group": null
}
],
"has_more": false,
"object": "list",
"url": "/v1/charges"
}


@@ -1,79 +0,0 @@
{
"account_balance": 0,
"address": null,
"balance": 0,
"created": 1000000000,
"currency": null,
"default_source": "card_NORMALIZED00000000000001",
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": null,
"email": "hamlet@zulip.com",
"id": "cus_NORMALIZED0001",
"invoice_prefix": "NORMA01",
"invoice_settings": {
"custom_fields": null,
"default_payment_method": null,
"footer": null
},
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"name": null,
"object": "customer",
"phone": null,
"preferred_locales": [],
"shipping": null,
"sources": {
"data": [
{
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/customers/cus_NORMALIZED0001/sources"
},
"subscriptions": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
},
"tax_exempt": "none",
"tax_ids": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
},
"tax_info": null,
"tax_info_verification": null
}


@@ -1,103 +0,0 @@
{
"account_balance": 0,
"address": null,
"balance": 0,
"created": 1000000000,
"currency": "usd",
"default_source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": null,
"email": "hamlet@zulip.com",
"id": "cus_NORMALIZED0001",
"invoice_prefix": "NORMA01",
"invoice_settings": {
"custom_fields": null,
"default_payment_method": null,
"footer": null
},
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"name": null,
"object": "customer",
"phone": null,
"preferred_locales": [],
"shipping": null,
"sources": {
"data": [
{
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/customers/cus_NORMALIZED0001/sources"
},
"subscriptions": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
},
"tax_exempt": "none",
"tax_ids": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
},
"tax_info": null,
"tax_info_verification": null
}


@@ -1,79 +0,0 @@
{
"account_balance": 0,
"address": null,
"balance": 0,
"created": 1000000000,
"currency": "usd",
"default_source": "card_NORMALIZED00000000000002",
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": null,
"email": "hamlet@zulip.com",
"id": "cus_NORMALIZED0001",
"invoice_prefix": "NORMA01",
"invoice_settings": {
"custom_fields": null,
"default_payment_method": null,
"footer": null
},
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"name": null,
"object": "customer",
"phone": null,
"preferred_locales": [],
"shipping": null,
"sources": {
"data": [
{
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000002",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/customers/cus_NORMALIZED0001/sources"
},
"subscriptions": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
},
"tax_exempt": "none",
"tax_ids": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
},
"tax_info": null,
"tax_info_verification": null
}


@@ -1,117 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 0,
"amount_paid": 0,
"amount_remaining": 0,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1000000000,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": null,
"ending_balance": null,
"footer": null,
"hosted_invoice_url": null,
"id": "in_NORMALIZED00000000000001",
"invoice_pdf": null,
"lines": {
"data": [
{
"amount": 7200,
"currency": "usd",
"description": "Zulip Standard",
"discountable": false,
"id": "ii_NORMALIZED00000000000001",
"invoice_item": "ii_NORMALIZED00000000000001",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 6,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
},
{
"amount": -7200,
"currency": "usd",
"description": "Payment (Card ending in 4242)",
"discountable": false,
"id": "ii_NORMALIZED00000000000002",
"invoice_item": "ii_NORMALIZED00000000000002",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 1,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
}
],
"has_more": false,
"object": "list",
"total_count": 2,
"url": "/v1/invoices/in_NORMALIZED00000000000001/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0001",
"object": "invoice",
"paid": false,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "draft",
"status_transitions": {
"finalized_at": null,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 0,
"tax": null,
"tax_percent": null,
"total": 0,
"total_tax_amounts": [],
"webhooks_delivered_at": null
}


@@ -1,117 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 0,
"amount_paid": 0,
"amount_remaining": 0,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1000000000,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": null,
"ending_balance": null,
"footer": null,
"hosted_invoice_url": null,
"id": "in_NORMALIZED00000000000002",
"invoice_pdf": null,
"lines": {
"data": [
{
"amount": 36000,
"currency": "usd",
"description": "Zulip Standard",
"discountable": false,
"id": "ii_NORMALIZED00000000000003",
"invoice_item": "ii_NORMALIZED00000000000003",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 6,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
},
{
"amount": -36000,
"currency": "usd",
"description": "Payment (Card ending in 4242)",
"discountable": false,
"id": "ii_NORMALIZED00000000000004",
"invoice_item": "ii_NORMALIZED00000000000004",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 1,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
}
],
"has_more": false,
"object": "list",
"total_count": 2,
"url": "/v1/invoices/in_NORMALIZED00000000000002/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0002",
"object": "invoice",
"paid": false,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "draft",
"status_transitions": {
"finalized_at": null,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 0,
"tax": null,
"tax_percent": null,
"total": 0,
"total_tax_amounts": [],
"webhooks_delivered_at": null
}


@@ -1,117 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 0,
"amount_paid": 0,
"amount_remaining": 0,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": true,
"auto_advance": false,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1000000000,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000001",
"id": "in_NORMALIZED00000000000001",
"invoice_pdf": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000001/pdf",
"lines": {
"data": [
{
"amount": 7200,
"currency": "usd",
"description": "Zulip Standard",
"discountable": false,
"id": "ii_NORMALIZED00000000000001",
"invoice_item": "ii_NORMALIZED00000000000001",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 6,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
},
{
"amount": -7200,
"currency": "usd",
"description": "Payment (Card ending in 4242)",
"discountable": false,
"id": "ii_NORMALIZED00000000000002",
"invoice_item": "ii_NORMALIZED00000000000002",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 1,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
}
],
"has_more": false,
"object": "list",
"total_count": 2,
"url": "/v1/invoices/in_NORMALIZED00000000000001/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": null,
"number": "NORMALI-0001",
"object": "invoice",
"paid": true,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "paid",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": 1000000000,
"voided_at": null
},
"subscription": null,
"subtotal": 0,
"tax": null,
"tax_percent": null,
"total": 0,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}


@@ -1,117 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 0,
"amount_paid": 0,
"amount_remaining": 0,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": true,
"auto_advance": false,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1000000000,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000002",
"id": "in_NORMALIZED00000000000002",
"invoice_pdf": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000002/pdf",
"lines": {
"data": [
{
"amount": 36000,
"currency": "usd",
"description": "Zulip Standard",
"discountable": false,
"id": "ii_NORMALIZED00000000000003",
"invoice_item": "ii_NORMALIZED00000000000003",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 6,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
},
{
"amount": -36000,
"currency": "usd",
"description": "Payment (Card ending in 4242)",
"discountable": false,
"id": "ii_NORMALIZED00000000000004",
"invoice_item": "ii_NORMALIZED00000000000004",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 1,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
}
],
"has_more": false,
"object": "list",
"total_count": 2,
"url": "/v1/invoices/in_NORMALIZED00000000000002/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": null,
"number": "NORMALI-0002",
"object": "invoice",
"paid": true,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "paid",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": 1000000000,
"voided_at": null
},
"subscription": null,
"subtotal": 0,
"tax": null,
"tax_percent": null,
"total": 0,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}

Some files were not shown because too many files have changed in this diff.