Compare commits

..

53 Commits

Author SHA1 Message Date
Tim Abbott
8e7ac21fe0 Release Zulip Server 1.9.2. 2019-01-29 16:33:36 -08:00
Tim Abbott
cbfae3e0d0 import: Fix uploading avatars with S3 upload backend.
This should hopefully be the last commit of this form; ultimately, my
hope is that we'll be able to refactor the semi-duplicated logic in
this file to avoid so much effort going into keeping this correct.
2019-01-29 16:26:19 -08:00
Tim Abbott
ffeb4340a9 auth: Migrate Google authentication off deprecated name API.
As part of Google+ being removed, they've eliminated support for the
/plus/v1/people/me endpoint.  Replace it with the very similar
/oauth2/v3/userinfo endpoint.
2019-01-29 16:17:27 -08:00
Matthew Wegner
79f781b9ea import: Normalize Slackbot String Comparison.
In very old Slack workspaces, slackbot can appear as "Slackbot", and
the import script only checks for "slackbot" (case sensitive).  This
breaks the import--it throws the assert that immediately follows the
test.  I don't know how common this is, but it definitely affected our
import.

The simple fix is to compare against a lowercased-version of the
user's full name.
2019-01-29 16:14:10 -08:00
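A minimal sketch of the comparison fix described above. The `real_name` field and `is_slackbot` helper are illustrative assumptions, not the import script's actual names:

```python
def is_slackbot(user: dict) -> bool:
    # Compare case-insensitively so "Slackbot" (seen in very old
    # workspaces) matches as well as the usual "slackbot".
    return user.get("real_name", "").lower() == "slackbot"

# Both spellings seen in Slack exports now pass the check:
assert is_slackbot({"real_name": "slackbot"})
assert is_slackbot({"real_name": "Slackbot"})
```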
Tim Abbott
fd89df63b4 hipchat: Fix importing of private messages.
Apparently a stupid typing issue meant that we broke this a few weeks
ago.
2019-01-29 16:13:25 -08:00
Tim Abbott
509d335705 import: Handle corner case around EMAIL_GATEWAY_BOT emails. 2019-01-29 16:12:01 -08:00
Tim Abbott
ce28ccf2bf import: Fix pointer logic for zulip->zulip imports.
Previously, the pointer was almost guaranteed to be an invalid random
value, because we renumber message IDs unconditionally now.
2019-01-29 16:11:56 -08:00
Tim Abbott
4adbeedef6 hipchat: Handle unusual emoticons.json format.
Apparently, hc-migrate can generate emoticons.json files with a
somewhat different format.  Assuming that other files are in the
normal format, we should be able to handle it like this.

See report in #11135.
2019-01-29 16:11:34 -08:00
Tim Abbott
09ed7d5b77 hipchat: Handle case where emoticons.json is not in export.
Apparently, some methods of exporting from HipChat do not include an
emoticons.json file.  We could test for this using the
`include_emoticons` field in `metadata.json`, but we currently don't
even bother to read that file.  Rather than changing that, we just
print a warning and proceed.  This is arguably better anyway, in that
often not having emoticons.json is the result of user error when
exporting, and it's nice to flag that this is happening.

Fixes #11135.
2019-01-29 16:11:30 -08:00
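A rough sketch of the "warn and proceed" behavior described above; the function name, path handling, and logging style are assumptions rather than the converter's actual code:

```python
import json
import logging
import os

def load_emoticons(export_dir: str) -> list:
    """Return emoticon records from a HipChat export, or [] if the file is absent."""
    path = os.path.join(export_dir, "emoticons.json")
    if not os.path.exists(path):
        # Some export methods omit emoticons.json entirely; warn so the
        # admin can tell whether that was intentional, then keep going.
        logging.warning("No emoticons.json in %s; skipping custom emoji.", export_dir)
        return []
    with open(path) as f:
        return json.load(f)
```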
Tim Abbott
f445d3f589 import: Ensure presence of basic avatar images for HipChat.
Our HipChat conversion tool didn't properly handle basic avatar
images, resulting in only the medium-size avatar images being imported
properly.  This fixes that bug by asking the import tool to do the
thumbnailing for the basic avatar image (from the .original file) as
well as the medium avatar image.
2019-01-29 16:11:23 -08:00
Tim Abbott
e8ee374d4f slack import: Import long-inactive users as long-term idle.
This avoids creating UserMessage rows for long-inactive users in
organizations with many thousands of users.
2019-01-29 16:10:59 -08:00
Tim Abbott
9ff5359522 export: Remove assertion on current working directory.
This command hasn't made deep assumptions about CWD for a long time,
and this enables users to run it through a symlink (etc.).

Fixes #10961.
2019-01-29 16:10:26 -08:00
Tim Abbott
a31f56443a import: Avoid unnecessary forks when downloading attachments.
The previous implementation used run_parallel incorrectly, passing it
a set of very small jobs (each was to download a single file), which
meant that we'd end up forking once for every file to download.

This correct implementation sends each of N threads 1/N of the files
to download, which is more consistent with the goal of distributing
the download work between N threads.
2019-01-29 16:09:42 -08:00
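A simplified illustration of the batching idea above: give each of N workers roughly 1/N of the files instead of launching one worker per file. The helper names and the use of `multiprocessing.Pool` are assumptions; the importer's own `run_parallel` machinery works differently in detail.

```python
from multiprocessing import Pool

def download_file(url: str) -> None:
    print("downloading", url)  # stand-in for the real S3/HTTP fetch

def download_batch(batch: list) -> None:
    for url in batch:
        download_file(url)

def download_all(urls: list, num_workers: int = 6) -> None:
    # Split the file list into num_workers chunks so each worker handles
    # a batch, rather than forking once per file.
    chunks = [urls[i::num_workers] for i in range(num_workers)]
    with Pool(num_workers) as pool:
        pool.map(download_batch, chunks)

if __name__ == "__main__":
    download_all([f"https://example.com/file{i}" for i in range(20)], num_workers=4)
```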
rht
0b263d8b8c slack import: Eliminate need to load all messages into memory.
This works by yielding messages sorted based on timestamp.  Because
the Slack exports are broken into files by date, it's convenient to do
a 2-layer sorting process, where we open all the files for a given
day, and then sort their messages by timestamp before yielding them.

Fixes #10930.
2019-01-29 16:09:37 -08:00
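A schematic of the two-layer approach described above, assuming the usual Slack export layout of one JSON file per channel per day (e.g. `general/2019-01-29.json`); the real converter's file handling differs in detail.

```python
import json
import os
from collections import defaultdict
from typing import Dict, Iterator, List

def iterate_messages(export_dir: str) -> Iterator[dict]:
    """Yield messages in timestamp order without loading the whole export."""
    # Group the per-channel files by date (the basename is the date).
    files_by_date: Dict[str, List[str]] = defaultdict(list)
    for channel in os.listdir(export_dir):
        channel_dir = os.path.join(export_dir, channel)
        if not os.path.isdir(channel_dir):
            continue
        for name in os.listdir(channel_dir):
            if name.endswith(".json"):
                files_by_date[name].append(os.path.join(channel_dir, name))

    # Layer 1: walk the dates in order.  Layer 2: open every channel's
    # file for that date, sort the combined messages by timestamp, yield.
    for date in sorted(files_by_date):
        day_messages: List[dict] = []
        for path in files_by_date[date]:
            with open(path) as f:
                day_messages.extend(json.load(f))
        for message in sorted(day_messages, key=lambda m: float(m["ts"])):
            yield message
```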
Tim Abbott
ad00b02c66 slack import: Fix all messages being imported to one channel.
This was an ugly variable-escape-from-loop regression introduced in
e59ff6e6db.
2019-01-29 16:09:06 -08:00
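The regression class named above is a common Python pitfall: a loop variable keeps its last value after the loop ends, so code that accidentally runs outside the loop body silently reuses one stale value. A generic illustration, not the actual converter code:

```python
channels = ["general", "random", "design"]
all_messages = {"general": ["hi"], "random": ["yo"], "design": ["hey"]}

# Buggy shape: the second loop sits outside the channel loop, so
# `channel` has already "escaped" with its final value and every
# message is attributed to the last channel.
converted = []
for channel in channels:
    messages = all_messages[channel]
for message in messages:                  # note: not nested
    converted.append((channel, message))  # always ("design", ...)

# Correct shape: keep the message loop nested so each message is
# paired with its own channel.
converted = []
for channel in channels:
    for message in all_messages[channel]:
        converted.append((channel, message))
```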
Tim Abbott
02f2ae4048 slack import: Fix empty values for custom profile fields.
The Slack import process would incorrectly issue
CustomProfileFieldValue entries with a value of "" for users who
didn't have a given CustomProfileField (especially common for the
"skype" and "phone" fields).  This had no user-visible effect, but
certainly added some clutter in the database.
2019-01-29 16:09:02 -08:00
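The fix amounts to a guard like the one below: only create a value row when the Slack profile actually has a non-empty value. The field names and dict shape are illustrative.

```python
slack_profile = {"skype": "", "phone": "555-0100", "title": "Engineer"}

custom_field_values = []
for field_name, value in slack_profile.items():
    if not value:
        # Skip empty values instead of writing CustomProfileFieldValue
        # rows with value == "".
        continue
    custom_field_values.append({"field": field_name, "value": value})
```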
Tim Abbott
56d4426738 gitter: Do something reasonable with invalid fullnames. 2019-01-29 16:08:55 -08:00
Tim Abbott
f0fe7d3887 scripts: Recommend apt update after enabling universe.
One needs to manually do an apt update after add-apt-repository, or it
won't actually work.
2019-01-29 16:07:31 -08:00
Sumanth V Rao
21166fbdf9 upgrade-zulip-stage-2: Added argument to skip purging old deployments.
This makes it possible to add --skip-purge-old-deployments in the
deploy_options section of /etc/zulip/zulip.conf, and control whether
old deployments are purged automatically on a system.

We still need to do https://github.com/zulip/zulip/issues/10534 and
probably also to add these arguments to be directly passed into
upgrade-zulip, but that can wait for future work.

Fixes #10946.
2019-01-29 16:06:08 -08:00
Tim Abbott
b2c865aab5 scripts: Fix incorrect garbage-collection of emoji/node caches.
Apparently, we were incorrectly expressing the paths in the
caches_in_use data structures for these two cache-cleaning algorithms,
resulting in the default threshold_days algorithm controlling which
caches could be garbage-collected.  While the emoji one was just a
performance optimization for upgrade-zulip-from-git, it was possible
for the main `node_modules` cache in use in production to be GCed,
resulting in LaTeX rendering being broken.
2019-01-29 16:05:32 -08:00
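A hedged sketch of the in-use check this commit describes: the directory listing and `caches_in_use` have to be expressed in the same path form, or an in-use cache can slip past the membership test and be deleted. Function and argument names here are illustrative, not Zulip's actual cleanup scripts.

```python
import os
import time

def removable_caches(cache_dir: str, caches_in_use: set, threshold_days: int) -> list:
    """Return cache paths that are old enough to delete and not currently in use."""
    # Normalize both sides; expressing caches_in_use in a different path
    # form than the directory listing is exactly the kind of bug that
    # lets an in-use cache be garbage-collected.
    in_use = {os.path.realpath(p) for p in caches_in_use}
    candidates = []
    for name in os.listdir(cache_dir):
        path = os.path.realpath(os.path.join(cache_dir, name))
        if path in in_use:
            continue  # never delete a cache that is currently in use
        age_days = (time.time() - os.path.getmtime(path)) / 86400
        if age_days > threshold_days:
            candidates.append(path)
    return candidates
```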
Tim Abbott
fd9847ffcb Release Zulip Server 1.9.1. 2018-11-30 13:10:54 -08:00
Tim Abbott
2bca1d4ef0 docs: Move generic reverse proxy notes further down. 2018-11-30 13:08:34 -08:00
Bruce
871fdddf86 docs: Document how to use Zulip behind an haproxy reverse proxy.
With significant rearrangement by tabbott to have more common text
between different proxy implementations.
2018-11-30 13:08:34 -08:00
Igor Posledov
bf4e01eb02 docs: Add nginx reverse proxy basic config example. 2018-11-30 13:08:34 -08:00
Tim Abbott
92ea9a0729 show_admins: Rewrite to use management library.
This makes this command more standardized, and helps avoid future bugs
like the one fixed in the last commit.
2018-11-30 13:08:34 -08:00
Tim Abbott
a36cb9b247 show_admins: Fix buggy realm parsing. 2018-11-30 13:08:34 -08:00
Rohitt Vashishtha
da71f67f85 scripts: Cleanly exit manage.py when run with python2.
Fixes #10854.
2018-11-30 13:08:34 -08:00
Anders Kaseorg
5518abe1d6 install: Check whether universe repository is enabled on Ubuntu.
Fixes #10417.

Signed-off-by: Anders Kaseorg <andersk@mit.edu>
2018-11-30 13:08:33 -08:00
Rohitt Vashishtha
b34848447e scripts: Make upgrade-zulip-* use root checking from zulip_tools.
This is mostly just a nice code deduplication/cleanup.
2018-11-30 13:08:33 -08:00
Rohitt Vashishtha
3d5c994a32 scripts: Make manage.py use root checking from zulip_tools. 2018-11-30 13:08:33 -08:00
Rohitt Vashishtha
cfa2c6bf37 scripts: Make zulip-puppet-apply check if the user is root.
Fixes #10833.
2018-11-30 13:08:33 -08:00
Rohitt Vashishtha
3d243a457f scripts: Add util functions for checking root to zulip_tools. 2018-11-30 13:08:33 -08:00
Tim Abbott
e414036eb3 docs: Fix missing quotes in su zulip -c documentation.
This fixes an actual user-facing issue in our mobile push
notifications documentation (where we were incorrectly failing to
quote the argument to `./manage.py register_server` making it not
work), as well as preventing future similar issues from occurring
again via a linter rule.
2018-11-30 13:08:33 -08:00
Tim Abbott
9ec154bbe8 docs: Document how to setup system postfix email with Zulip. 2018-11-30 13:08:33 -08:00
Tim Abbott
da479c3613 setup-apt-repo: Install gnupg as part of installation.
Apparently, on Debian stretch, the gnupg package isn't installed by
default, which means that our `apt-key add` commands were failing with
these errors on an ultra-minimal Debian installation:

+ apt-key add ./scripts/setup/packagecloud.asc
E: gnupg, gnupg2 and gnupg1 do not seem to be installed, but one of them is required for this operation
+ apt-key add ./scripts/setup/pgroonga-debian.asc
E: gnupg, gnupg2 and gnupg1 do not seem to be installed, but one of them is required for this operation

Fixes #10480.
2018-11-30 13:08:33 -08:00
Tim Abbott
23d8f6e6b0 i18n: Update translation data from transifex. 2018-11-28 12:48:53 -08:00
Tim Abbott
d929ad0122 upload: Fix ensure_medium_avatar_image for S3 backend.
Previously, it tried to interact with the wrong path for the original
image.
2018-11-28 12:30:24 -08:00
Tim Abbott
429d0f9728 push notifications: Improve logging for missing configuration.
While it could make sense to print these logging statements at WARN
level on server startup, it doesn't make sense to do so on every
message (though it perhaps did make sense to do so before more recent
changes added good ways to discover you forgot to configure push
notifications).

Instead, we now just do a WARN log on queue processor startup, and
then at DEBUG level for individual messages.

Fixes #10894.
2018-11-28 12:30:24 -08:00
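A minimal sketch of the logging split described above, assuming a hypothetical `push_notifications_configured()` check; the real function names and queue-processor hooks differ.

```python
import logging

logger = logging.getLogger(__name__)

def push_notifications_configured() -> bool:
    return False  # stand-in for the real settings check

def on_queue_processor_start() -> None:
    if not push_notifications_configured():
        # Warn once, loudly, when the worker starts up...
        logger.warning("Push notifications are not configured; skipping them.")

def handle_message(message_id: int) -> None:
    if not push_notifications_configured():
        # ...but only log at DEBUG for each individual message.
        logger.debug("Skipping push notification for message %d.", message_id)
        return
    # ... send the notification ...
```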
Tim Abbott
c905915055 push notifications: Fix a comment typo. 2018-11-28 12:30:24 -08:00
Tim Abbott
b238d0e75e docs: Fix some broken links in security model doc.
Apparently, we haven't been running test-documentation in production
of late.
2018-11-28 12:30:23 -08:00
Tim Abbott
a4ebdac521 docs: Document how to use AWS SIGv4 with boto.
This is required in some AWS regions.

The right long-term fix is to move to boto3 which doesn't have this
problem.

Allows us to downgrade the priority of #9376.
2018-11-28 12:27:18 -08:00
Tim Abbott
bdd6e1fabe docs: Fix accidental repeat bullet #1 in S3 backend documentation.
Due to missing indentation, the numbering was resetting to 1 rather
than continuing to 6.
2018-11-28 12:27:15 -08:00
Vishnu Ks
d120ee25b4 auth: Always force Google to show account chooser.
Fixes #10515
2018-11-16 11:57:12 -08:00
Tim Abbott
14938fbf4c nginx: Fix missing API authentication configuration.
This fixes a bug where our API routes for uploaded files (where we
need to use a consistent URL between session auth and API auth) were
not properly configured to pass through the API authentication headers
(and otherwise provide REST endpoint settings).

In particular, this prevented the Zulip mobile apps from being able to
access authenticated image files using these URLs.
2018-11-16 11:57:05 -08:00
Tim Abbott
8babe17f8f docs: Clarify preparatory process for data import.
You need a Zulip server running a matching version, and no longer
need to do an upgrade from master before using established import tools.
2018-11-14 17:15:02 -08:00
Tim Abbott
724e1d3002 docs: Link to setup-certbot multiple hostname support. 2018-11-14 13:10:05 -08:00
Rohitt Vashishtha
a474a08195 setup-certbot: Allow issuing certificates for multiple domains.
This commit allows specifying Subject Alternative Names to issue certs
for multiple domains using certbot. The first name passed to certbot-auto
becomes the common name for the certificate; common name and the other
names are then added to the SAN field. All of these arguments are now
positional. Also read the following for the certbot syntax reference:

https://community.letsencrypt.org/t/how-to-specify-subject-name-on-san/

Fixes #10674.
2018-11-14 13:10:02 -08:00
Tim Abbott
a48bbef766 docs: Clarify push registration for running manage.py correctly.
We've had several users get errors running this because they ran it as
a bash script; fix this by making the command super explicit.
2018-11-14 13:09:58 -08:00
Tim Abbott
71c5632e07 install: Provide a suggestive error message when missing Universe.
By far the dominant cause of errors when installing apt packages is
not having the Universe repository enabled in Ubuntu bionic (this
seems to have started happening a lot recently; I wonder if Ubuntu
changed the defaults for new server installs or something?).

In any case, providing that suggestion in the error output should help
reduce these a lot.
2018-11-12 10:57:07 -08:00
Tim Abbott
498b6e4670 install: Improve some error output for common errors.
This uses `set +x` to hide the `echo` output, and then sets the font
color to red.
2018-11-12 10:57:04 -08:00
Tim Abbott
925ddc28f4 import: Don't assume a last_modified key is present.
This fixes an exception when importing uploaded file data from
Slack/HipChat.
2018-11-08 15:29:49 -08:00
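The fix described above amounts to treating the key as optional. A tiny sketch with a made-up upload record; the real import code's field handling differs:

```python
record = {"path": "uploads/avatar.png", "size": 2048}  # no "last_modified"

# Buggy: record["last_modified"] raises KeyError when the exporter
# omitted the field.  Fixed: tolerate a missing key and fall back.
last_modified = record.get("last_modified")
if last_modified is None:
    last_modified = 0  # or the time of the import
```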
Tim Abbott
3651b8d254 import: Fix buggy handling of avatars in Slack conversion.
This was a pretty nasty error, where we were accidentally accessing
the parent list in this inner loop function.

This appears to have been introduced as a refactoring bug in
7822ef38c2.
2018-11-08 15:29:25 -08:00
Tim Abbott
90d3cbceed docs: Further document tokenized noreply email addresses.
We should still extend email.html to explain the security issue a bit
more clearly, since the article we link to is super long.
2018-11-08 15:29:22 -08:00
4140 changed files with 284359 additions and 467233 deletions

View File

@@ -1,6 +0,0 @@
> 0.2%
> 0.2% in US
last 2 versions
Firefox ESR
not dead
Chrome 26 # similar to PhantomJS

View File

@@ -1,383 +1,205 @@
# See https://zulip.readthedocs.io/en/latest/testing/continuous-integration.html for
# high-level documentation on our CircleCI setup.
# See CircleCI upstream's docs on this config format:
# https://circleci.com/docs/2.0/language-python/
#
version: 2.0
aliases:
- &create_cache_directories
run:
name: create cache directories
command: |
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R circleci "${dirs[@]}"
- &restore_cache_package_json
restore_cache:
keys:
- v1-npm-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- &restore_cache_requirements
restore_cache:
keys:
- v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "requirements/thumbor-dev.txt" }}-{{ checksum "requirements/dev.txt" }}
- &restore_emoji_cache
restore_cache:
keys:
- v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "tools/setup/emoji/emoji_map.json" }}-{{ checksum "tools/setup/emoji/build_emoji" }}-{{checksum "tools/setup/emoji/emoji_setup_utils.py" }}-{{ checksum "tools/setup/emoji/emoji_names.py" }}-{{ checksum "package.json" }}
- &install_dependencies
run:
name: install dependencies
command: |
sudo apt-get update
# Install moreutils so we can use `ts` and `mispipe` in the following.
sudo apt-get install -y moreutils
# CircleCI sets the following in Git config at clone time:
# url.ssh://git@github.com.insteadOf https://github.com
# This breaks the Git clones in the NVM `install.sh` we run
# in `install-node`.
# TODO: figure out why that breaks, and whether we want it.
# (Is it an optimization?)
rm -f /home/circleci/.gitconfig
# This is the main setup job for the test suite
mispipe "tools/ci/setup-backend --skip-dev-db-build" ts
# Cleaning caches is mostly unnecessary in Circle, because
# most builds don't get to write to the cache.
# mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0 2>&1" ts
- &save_cache_package_json
save_cache:
paths:
- /srv/zulip-npm-cache
key: v1-npm-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- &save_cache_requirements
save_cache:
paths:
- /srv/zulip-venv-cache
key: v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "requirements/thumbor-dev.txt" }}-{{ checksum "requirements/dev.txt" }}
- &save_emoji_cache
save_cache:
paths:
- /srv/zulip-emoji-cache
key: v1-venv-base.{{ .Environment.CIRCLE_JOB }}-{{ checksum "tools/setup/emoji/emoji_map.json" }}-{{ checksum "tools/setup/emoji/build_emoji" }}-{{checksum "tools/setup/emoji/emoji_setup_utils.py" }}-{{ checksum "tools/setup/emoji/emoji_names.py" }}-{{ checksum "package.json" }}
- &do_bionic_hack
run:
name: do Bionic hack
command: |
# Temporary hack till `sudo service redis-server start` gets fixes in Bionic. See
# https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
sudo sed -i '/^bind/s/bind.*/bind 0.0.0.0/' /etc/redis/redis.conf
- &run_backend_tests
run:
name: run backend tests
command: |
. /srv/zulip-py3-venv/bin/activate
mispipe "./tools/ci/backend 2>&1" ts
- &run_frontend_tests
run:
name: run frontend tests
command: |
. /srv/zulip-py3-venv/bin/activate
mispipe "./tools/ci/frontend 2>&1" ts
- &upload_coverage_report
run:
name: upload coverage report
command: |
# codecov requires `.coverage` file to be stored in pwd for
# uploading coverage results.
mv /home/circleci/zulip/var/.coverage /home/circleci/zulip/.coverage
. /srv/zulip-py3-venv/bin/activate
# TODO: Check that the next release of codecov doesn't
# throw find error.
# codecov==2.0.16 introduced a bug which uses "find"
# for locating files which is buggy on some platforms.
# It was fixed via https://github.com/codecov/codecov-python/pull/217
# and should get automatically fixed here once it's released.
# We cannot pin the version here because we need the latest version for uploading files.
# see https://community.codecov.io/t/http-400-while-uploading-to-s3-with-python-codecov-from-travis/1428/7
pip install codecov && codecov \
|| echo "Error in uploading coverage reports to codecov.io."
- &build_production
run:
name: build production
command: |
sudo apt-get update
# Install moreutils so we can use `ts` and `mispipe` in the following.
sudo apt-get install -y moreutils
mispipe "./tools/ci/production-build 2>&1" ts
- &production_extract_tarball
run:
name: production extract tarball
command: |
sudo apt-get update
# Install moreutils so we can use `ts` and `mispipe` in the following.
sudo apt-get install -y moreutils
mispipe "/tmp/production-extract-tarball 2>&1" ts
- &install_production
run:
name: install production
command: |
sudo service rabbitmq-server restart
sudo mispipe "/tmp/production-install 2>&1" ts
- &verify_production
run:
name: verify install
command: |
sudo mispipe "/tmp/production-verify 2>&1" ts
- &upgrade_postgresql
run:
name: upgrade postgresql
command: |
sudo mispipe "/tmp/production-upgrade-pg 2>&1" ts
- &check_xenial_provision_error
run:
name: check tools/provision error message on xenial
command: |
! tools/provision > >(tee provision.out)
grep -Fqx 'CRITICAL:root:Unsupported platform: ubuntu 16.04' provision.out
- &check_xenial_upgrade_error
run:
name: check scripts/lib/upgrade-zulip-stage-2 error message on xenial
command: |
! sudo scripts/lib/upgrade-zulip-stage-2 2> >(tee upgrade.err >&2)
grep -Fq 'upgrade-zulip-stage-2: Unsupported platform: ubuntu 16.04' upgrade.err
- &notify_failure_status
run:
name: On fail
when: on_fail
branches:
only: master
command: |
if [[ "$CIRCLE_REPOSITORY_URL" == "git@github.com:zulip/zulip.git" && "$ZULIP_BOT_KEY" != "" ]]; then
curl -H "Content-Type: application/json" \
-X POST -i 'https://chat.zulip.org/api/v1/external/circleci?api_key='"$ZULIP_BOT_KEY"'&stream=automated%20testing&topic=master%20failing' \
-d '{"payload": { "branch": "'"$CIRCLE_BRANCH"'", "reponame": "'"$CIRCLE_PROJECT_REPONAME"'", "status": "failed", "build_url": "'"$CIRCLE_BUILD_URL"'", "username": "'"$CIRCLE_USERNAME"'"}}'
fi
version: 2
jobs:
"bionic-backend-frontend":
"trusty-python-3.4":
docker:
# This is built from tools/ci/images/bionic/Dockerfile .
# Bionic ships with Python 3.6.
- image: arpit551/circleci:bionic-python-test
# This is built from tools/circleci/images/trusty/Dockerfile .
- image: gregprice/circleci:trusty-python-5.test
working_directory: ~/zulip
steps:
- checkout
- *create_cache_directories
- *do_bionic_hack
- *restore_cache_package_json
- *restore_cache_requirements
- *restore_emoji_cache
- *install_dependencies
- *save_cache_package_json
- *save_cache_requirements
- *save_emoji_cache
- *run_backend_tests
- run:
name: create cache directories
command: |
dirs=(/srv/zulip-{npm,venv}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R circleci "${dirs[@]}"
- restore_cache:
keys:
- v1-npm-base.trusty-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- restore_cache:
keys:
- v1-venv-base.trusty-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
- run:
name: test locked requirements
name: install dependencies
command: |
# Install moreutils so we can use `ts` and `mispipe` in the following.
sudo apt-get install -y moreutils
# CircleCI sets the following in Git config at clone time:
# url.ssh://git@github.com.insteadOf https://github.com
# This breaks the Git clones in the NVM `install.sh` we run
# in `install-node`.
# TODO: figure out why that breaks, and whether we want it.
# (Is it an optimization?)
rm -f /home/circleci/.gitconfig
# This is the main setup job for the test suite
mispipe "tools/travis/setup-backend" ts
# Cleaning caches is mostly unnecessary in Circle, because
# most builds don't get to write to the cache.
# mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0" ts
- save_cache:
paths:
- /srv/zulip-npm-cache
key: v1-npm-base.trusty-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- save_cache:
paths:
- /srv/zulip-venv-cache
key: v1-venv-base.trusty-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
# TODO: in Travis we also cache ~/zulip-emoji-cache, ~/node, ~/misc
# The moment of truth! Run the tests.
- run:
name: run backend tests
command: |
. /srv/zulip-py3-venv/bin/activate
mispipe "./tools/test-locked-requirements 2>&1" ts
- *run_frontend_tests
# We only need to upload coverage reports on whichever platform
# runs the frontend tests.
- *upload_coverage_report
- store_artifacts:
path: ./var/casper/
destination: casper
- store_artifacts:
path: ./var/puppeteer/
destination: puppeteer
- store_artifacts:
path: ../../../tmp/zulip-test-event-log/
destination: test-reports
- store_test_results:
path: ./var/xunit-test-results/casper/
- *notify_failure_status
"focal-backend":
docker:
# This is built from tools/ci/images/focal/Dockerfile.
# Focal ships with Python 3.8.2.
- image: arpit551/circleci:focal-python-test
working_directory: ~/zulip
steps:
- checkout
- *create_cache_directories
- *restore_cache_package_json
- *restore_cache_requirements
- *restore_emoji_cache
- *install_dependencies
- *save_cache_package_json
- *save_cache_requirements
- *save_emoji_cache
- *run_backend_tests
- run:
name: Check development database build
command: mispipe "tools/ci/setup-backend" ts
- *notify_failure_status
"xenial-legacy":
docker:
- image: arpit551/circleci:xenial-python-test
working_directory: ~/zulip
steps:
- checkout
- *check_xenial_provision_error
- *check_xenial_upgrade_error
- *notify_failure_status
"bionic-production-build":
docker:
# This is built from tools/ci/images/bionic/Dockerfile .
# Bionic ships with Python 3.6.
- image: arpit551/circleci:bionic-python-test
working_directory: ~/zulip
steps:
- checkout
- *create_cache_directories
- *do_bionic_hack
- *restore_cache_package_json
- *restore_cache_requirements
- *restore_emoji_cache
- *build_production
- *save_cache_package_json
- *save_cache_requirements
- *save_emoji_cache
# Persist the built tarball to be used in downstream job
# for installation of production server.
# See https://circleci.com/docs/2.0/workflows/#using-workspaces-to-share-data-among-jobs
- persist_to_workspace:
# Must be an absolute path,
# or relative path from working_directory.
# This is a directory on the container which is
# taken to be the root directory of the workspace.
root: /tmp
# Must be relative path from root
paths:
- zulip-server-test.tar.gz
- success-http-headers.template.txt
- production-install
- production-verify
- production-upgrade-pg
- production
- production-extract-tarball
- *notify_failure_status
"bionic-production-install":
docker:
# This is built from tools/ci/images/bionic/Dockerfile .
# Bionic ships with Python 3.6.
- image: arpit551/circleci:bionic-python-test
working_directory: ~/zulip
steps:
# Contains the built tarball from bionic-production-build job
- attach_workspace:
# Must be absolute path or relative path from working_directory
at: /tmp
- *create_cache_directories
- *do_bionic_hack
- *production_extract_tarball
- *restore_cache_package_json
- *install_production
- *verify_production
- *upgrade_postgresql
- *verify_production
- *save_cache_package_json
- *notify_failure_status
"focal-production-install":
docker:
# This is built from tools/ci/images/focal/Dockerfile.
# Focal ships with Python 3.8.2.
- image: arpit551/circleci:focal-python-test
working_directory: ~/zulip
steps:
# Contains the built tarball from bionic-production-build job
- attach_workspace:
# Must be absolute path or relative path from working_directory
at: /tmp
- *create_cache_directories
mispipe ./tools/travis/backend ts
- run:
name: do memcached hack
name: run frontend tests
command: |
# Temporary hack till memcached upstream is updated in Focal.
# https://bugs.launchpad.net/ubuntu/+source/memcached/+bug/1878721
echo "export SASL_CONF_PATH=/etc/sasl2" | sudo tee - a /etc/default/memcached
. /srv/zulip-py3-venv/bin/activate
mispipe ./tools/travis/frontend ts
- run:
name: upload coverage report
command: |
. /srv/zulip-py3-venv/bin/activate
pip install codecov && codecov \
|| echo "Error in uploading coverage reports to codecov.io."
# - store_artifacts: # TODO
# path: var/casper/
# # also /tmp/zulip-test-event-log/
# destination: test-reports
"xenial-python-3.5":
docker:
# This is built from tools/circleci/images/xenial/Dockerfile .
- image: gregprice/circleci:xenial-python-4.test
working_directory: ~/zulip
steps:
- checkout
- run:
name: create cache directories
command: |
dirs=(/srv/zulip-{npm,venv}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R circleci "${dirs[@]}"
- restore_cache:
keys:
- v1-npm-base.xenial-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- restore_cache:
keys:
- v1-venv-base.xenial-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
- run:
name: install dependencies
command: |
sudo apt-get update
sudo apt-get install -y moreutils
rm -f /home/circleci/.gitconfig
mispipe "tools/travis/setup-backend" ts
- save_cache:
paths:
- /srv/zulip-npm-cache
key: v1-npm-base.xenial-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- save_cache:
paths:
- /srv/zulip-venv-cache
key: v1-venv-base.xenial-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
- run:
name: run backend tests
command: |
. /srv/zulip-py3-venv/bin/activate
mispipe ./tools/travis/backend ts
- run:
name: upload coverage report
command: |
. /srv/zulip-py3-venv/bin/activate
pip install codecov && codecov \
|| echo "Error in uploading coverage reports to codecov.io."
"bionic-python-3.6":
docker:
# This is built from tools/circleci/images/bionic/Dockerfile .
- image: gregprice/circleci:bionic-python-1.test
working_directory: ~/zulip
steps:
- checkout
- run:
name: create cache directories
command: |
dirs=(/srv/zulip-{npm,venv}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R circleci "${dirs[@]}"
# Temporary hack till `sudo service redis-server start` gets fixes in Bionic. See
# https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
redis-server --daemonize yes
- restore_cache:
keys:
- v1-npm-base.bionic-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- restore_cache:
keys:
- v1-venv-base.bionic-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
- run:
name: install dependencies
command: |
sudo apt-get update
sudo apt-get install -y moreutils
rm -f /home/circleci/.gitconfig
mispipe "tools/travis/setup-backend" ts
- save_cache:
paths:
- /srv/zulip-npm-cache
key: v1-npm-base.bionic-{{ checksum "package.json" }}-{{ checksum "yarn.lock" }}
- save_cache:
paths:
- /srv/zulip-venv-cache
key: v1-venv-base.bionic-{{ checksum "requirements/thumbor.txt" }}-{{ checksum "requirements/dev.txt" }}
- run:
name: run backend tests
command: |
. /srv/zulip-py3-venv/bin/activate
mispipe ./tools/travis/backend ts
- run:
name: upload coverage report
command: |
. /srv/zulip-py3-venv/bin/activate
pip install codecov && codecov \
|| echo "Error in uploading coverage reports to codecov.io."
- *production_extract_tarball
- *restore_cache_package_json
- *install_production
- *verify_production
- *save_cache_package_json
- *notify_failure_status
workflows:
version: 2
"Ubuntu 16.04 Xenial (Python 3.5, legacy)":
build:
jobs:
- "xenial-legacy"
"Ubuntu 18.04 Bionic (Python 3.6, backend+frontend)":
jobs:
- "bionic-backend-frontend"
"Ubuntu 20.04 Focal (Python 3.8, backend)":
jobs:
- "focal-backend"
"Production":
jobs:
- "bionic-production-build"
- "bionic-production-install":
requires:
- "bionic-production-build"
- "focal-production-install":
requires:
- "bionic-production-build"
- "trusty-python-3.4"
- "xenial-python-3.5"
- "bionic-python-3.6"

View File

@@ -5,8 +5,6 @@ coverage:
project:
default:
target: auto
# Codecov has the tendency to report a lot of false negatives,
# so we basically suppress comments completely.
threshold: 50%
threshold: 0.50
base: auto
patch: off

View File

@@ -6,16 +6,20 @@ charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true
[*.{sh,py,pyi,js,ts,json,xml,css,scss,hbs,html}]
[*.{sh,py,pyi,js,json,yml,xml,css,md,markdown,handlebars,html}]
indent_style = space
indent_size = 4
[*.py]
[*.{py}]
max_line_length = 110
[*.{js,ts}]
max_line_length = 100
[*.{js}]
max_line_length = 120
[*.{svg,rb,pp}]
[*.{svg,rb,pp,pl}]
indent_style = space
indent_size = 2
[*.{cfg}]
indent_style = space
indent_size = 8

View File

@@ -1,10 +1,2 @@
# This is intended for generated files and vendored third-party files.
# For our source code, instead of adding files here, consider using
# specific eslint-disable comments in the files themselves.
/docs/_build
/static/generated
/static/third
/static/webpack-bundles
/var
/zulip-py3-venv
static/js/blueslip.js
static/webpack-bundles

View File

@@ -3,125 +3,369 @@
"node": true,
"es6": true
},
"extends": [
"eslint:recommended"
],
"parserOptions": {
"ecmaVersion": 2019,
"warnOnUnsupportedTypeScriptVersion": false,
"sourceType": "module"
},
"globals": {
"$": false,
"ClipboardJS": false,
"Dict": false,
"FetchStatus": false,
"Filter": false,
"Handlebars": false,
"LightboxCanvas": false,
"MessageListData": false,
"MessageListView": false,
"PerfectScrollbar": false,
"Plotly": false,
"SockJS": false,
"Socket": false,
"Sortable": false,
"WinChan": false,
"XDate": false,
"_": false,
"activity": false,
"admin": false,
"admin_sections": false,
"alert_words": false,
"alert_words_ui": false,
"attachments_ui": false,
"avatar": false,
"blueslip": false,
"bot_data": false,
"bridge": false,
"buddy_data": false,
"buddy_list": false,
"channel": false,
"click_handlers": false,
"colorspace": false,
"common": false,
"components": false,
"compose": false,
"compose_actions": false,
"compose_fade": false,
"compose_pm_pill": false,
"compose_state": false,
"compose_ui": false,
"composebox_typeahead": false,
"condense": false,
"confirm_dialog": false,
"copy_and_paste": false,
"csrf_token": false,
"current_msg_list": true,
"drafts": false,
"echo": false,
"emoji": false,
"emoji_codes": false,
"emoji_picker": false,
"favicon": false,
"feature_flags": false,
"fenced_code": false,
"flatpickr": false,
"floating_recipient_bar": false,
"gear_menu": false,
"hash_util": false,
"hashchange": false,
"home_msg_list": false,
"hotspots": false,
"i18n": false,
"info_overlay": false,
"input_pill": false,
"invite": false,
"jQuery": false,
"katex": false,
"keydown_util": false,
"lightbox": false,
"list_cursor": false,
"list_render": false,
"list_util": false,
"loading": false,
"localStorage": false,
"local_message": false,
"localstorage": false,
"markdown": false,
"marked": false,
"md5": false,
"message_edit": false,
"message_events": false,
"message_fetch": false,
"message_flags": false,
"message_list": false,
"message_live_update": false,
"message_scroll": false,
"message_store": false,
"message_util": false,
"message_viewport": false,
"moment": false,
"muting": false,
"muting_ui": false,
"narrow": false,
"narrow_state": false,
"navigate": false,
"night_mode": false,
"notifications": false,
"overlays": false,
"padded_widget": false,
"page_params": false,
"panels": false,
"people": false,
"pm_conversations": false,
"pm_list": false,
"pointer": false,
"popovers": false,
"presence": false,
"pygments_data": false,
"reactions": false,
"realm_icon": false,
"recent_senders": false,
"reload": false,
"reload_state": false,
"reminder": false,
"resize": false,
"rows": false,
"rtl": false,
"run_test": false,
"schema": false,
"scroll_bar": false,
"scroll_util": false,
"search": false,
"search_pill": false,
"search_pill_widget": false,
"search_suggestion": false,
"search_util": false,
"sent_messages": false,
"server_events": false,
"server_events_dispatch": false,
"settings": false,
"settings_account": false,
"settings_bots": false,
"settings_display": false,
"settings_emoji": false,
"settings_filters": false,
"settings_invites": false,
"settings_muting": false,
"settings_notifications": false,
"settings_org": false,
"settings_panel_menu": false,
"settings_profile_fields": false,
"settings_sections": false,
"settings_streams": false,
"settings_toggle": false,
"settings_ui": false,
"settings_user_groups": false,
"settings_users": false,
"starred_messages": false,
"stream_color": false,
"stream_create": false,
"stream_data": false,
"stream_edit": false,
"stream_events": false,
"stream_list": false,
"stream_muting": false,
"stream_popover": false,
"stream_sort": false,
"submessage": false,
"subs": false,
"tab_bar": false,
"templates": false,
"tictactoe_widget": false,
"timerender": false,
"toMarkdown": false,
"todo_widget": false,
"top_left_corner": false,
"topic_data": false,
"topic_generator": false,
"topic_list": false,
"topic_zoom": false,
"transmit": false,
"tutorial": false,
"typeahead_helper": false,
"typing": false,
"typing_data": false,
"typing_events": false,
"typing_status": false,
"ui": false,
"ui_report": false,
"ui_util": false,
"unread": false,
"unread_ops": false,
"unread_ui": false,
"upload": false,
"upload_widget": false,
"user_events": false,
"user_groups": false,
"user_pill": false,
"user_search": false,
"util": false,
"voting_widget": false,
"widgetize": false,
"zcommand": false,
"zform": false,
"zxcvbn": false
},
"plugins": [
"eslint-plugin-empty-returns"
],
"rules": {
"array-callback-return": "error",
"array-bracket-spacing": "error",
"arrow-body-style": "error",
"arrow-parens": "error",
"arrow-spacing": [ "error", { "before": true, "after": true } ],
"block-scoped-var": "error",
"block-scoped-var": 2,
"brace-style": [ "error", "1tbs", { "allowSingleLine": true } ],
"comma-dangle": [ "error", "always-multiline" ],
"comma-spacing": [ "error",
"camelcase": 0,
"comma-dangle": [ "error",
{
"before": false,
"after": true
"arrays": "always-multiline",
"objects": "always-multiline",
"imports": "always-multiline",
"exports": "always-multiline",
"functions": "never"
}
],
"curly": "error",
"dot-notation": "error",
"complexity": [ 0, 4 ],
"curly": 2,
"dot-notation": [ "error", { "allowKeywords": true } ],
"empty-returns/main": "error",
"eol-last": "error",
"eqeqeq": "error",
"guard-for-in": "error",
"eol-last": [ "error", "always" ],
"eqeqeq": 2,
"func-style": [ "off", "expression" ],
"guard-for-in": 2,
"indent": ["error", 4, {
"ArrayExpression": "first",
"outerIIFEBody": 0,
"ObjectExpression": "first",
"SwitchCase": 0,
"CallExpression": {"arguments": "first"},
"FunctionExpression": {"parameters": "first"},
"FunctionDeclaration": {"parameters": "first"}
}],
"key-spacing": "error",
"keyword-spacing": "error",
"keyword-spacing": [ "error",
{
"before": true,
"after": true,
"overrides": {
"return": { "after": true },
"throw": { "after": true },
"case": { "after": true }
}
}
],
"max-depth": [ 0, 4 ],
"max-len": [ "error", 100, 2,
{
"ignoreUrls": true,
"ignoreComments": false,
"ignoreRegExpLiterals": true,
"ignoreStrings": true,
"ignoreTemplateLiterals": true
}
],
"max-params": [ 0, 3 ],
"max-statements": [ 0, 10 ],
"new-cap": [ "error",
{
"newIsCap": true,
"capIsNew": false
}
],
"new-parens": "error",
"no-alert": "error",
"new-parens": 2,
"newline-per-chained-call": 0,
"no-alert": 2,
"no-array-constructor": "error",
"no-bitwise": "error",
"no-caller": "error",
"no-catch-shadow": "error",
"no-constant-condition": ["error", {"checkLoops": false}],
"no-div-regex": "error",
"no-bitwise": 2,
"no-caller": 2,
"no-case-declarations": "error",
"no-catch-shadow": 2,
"no-console": 0,
"no-const-assign": "error",
"no-control-regex": 2,
"no-debugger": 2,
"no-delete-var": 2,
"no-div-regex": 2,
"no-dupe-class-members": "error",
"no-dupe-keys": 2,
"no-duplicate-imports": "error",
"no-else-return": "error",
"no-eq-null": "error",
"no-eval": "error",
"no-extra-parens": "error",
"no-floating-decimal": "error",
"no-implied-eval": "error",
"no-inner-declarations": "off",
"no-else-return": 2,
"no-empty": 2,
"no-empty-character-class": 2,
"no-eq-null": 2,
"no-eval": 2,
"no-ex-assign": 2,
"no-extra-parens": ["error", "all"],
"no-extra-semi": 2,
"no-fallthrough": 2,
"no-floating-decimal": 2,
"no-func-assign": 2,
"no-implied-eval": 2,
"no-iterator": "error",
"no-label-var": "error",
"no-labels": "error",
"no-loop-func": "error",
"no-multi-str": "error",
"no-native-reassign": "error",
"no-label-var": 2,
"no-labels": 2,
"no-loop-func": 2,
"no-mixed-requires": [ 0, false ],
"no-multi-str": 2,
"no-native-reassign": 2,
"no-nested-ternary": 0,
"no-new-func": "error",
"no-new-object": "error",
"no-new-wrappers": "error",
"no-octal-escape": "error",
"no-plusplus": "error",
"no-proto": "error",
"no-return-assign": "error",
"no-script-url": "error",
"no-self-compare": "error",
"no-sync": "error",
"no-trailing-spaces": "error",
"no-undef-init": "error",
"no-new-object": 2,
"no-new-wrappers": 2,
"no-obj-calls": 2,
"no-octal": 2,
"no-octal-escape": 2,
"no-param-reassign": 0,
"no-plusplus": 2,
"no-proto": 2,
"no-redeclare": 2,
"no-regex-spaces": 2,
"no-restricted-syntax": 0,
"no-return-assign": 2,
"no-script-url": 2,
"no-self-compare": 2,
"no-shadow": 0,
"no-sync": 2,
"no-ternary": 0,
"no-trailing-spaces": 2,
"no-undef": "error",
"no-undef-init": 2,
"no-underscore-dangle": 0,
"no-unneeded-ternary": [ "error", { "defaultAssignment": false } ],
"no-unused-expressions": "error",
"no-unreachable": 2,
"no-unused-expressions": 2,
"no-unused-vars": [ "error",
{
"vars": "local",
"args": "after-used",
"varsIgnorePattern": "print_elapsed_time|check_duplicate_ids"
}
],
"no-use-before-define": "error",
"no-use-before-define": 2,
"no-useless-constructor": "error",
// The Zulip codebase complies partially with the "no-useless-escape"
// rule; only regex expressions haven't been updated yet.
// Updated regex expressions are currently being tested in casper
// files and will decide about a potential future enforcement of this rule.
"no-useless-escape": "off",
"no-var": "error",
"space-unary-ops": "error",
"no-whitespace-before-property": "error",
"no-useless-escape": 0,
"space-unary-ops": 2,
"no-whitespace-before-property": 2,
"no-with": 2,
"one-var": [ "error", "never" ],
"prefer-arrow-callback": "error",
"padded-blocks": 0,
"prefer-const": [ "error",
{
"destructuring": "any",
"ignoreReadBeforeAssign": true
}
],
"quote-props": [ "error", "as-needed" ],
"radix": "error",
"semi": "error",
"semi-spacing": "error",
"sort-imports": "error",
"space-before-blocks": "error",
"quote-props": [ "error", "as-needed",
{
"keywords": false,
"unnecessary": true,
"numbers": false
}
],
"quotes": [ 0, "single" ],
"radix": 2,
"semi": 2,
"space-before-blocks": 2,
"space-before-function-paren": [ "error",
{
"anonymous": "always",
@@ -129,294 +373,16 @@
"asyncArrow": "always"
}
],
"space-in-parens": "error",
"space-infix-ops": "error",
"spaced-comment": "off",
"strict": "off",
"space-in-parens": 2,
"space-infix-ops": 2,
"spaced-comment": 0,
"strict": 0,
"template-curly-spacing": "error",
"unnecessary-strict": 0,
"use-isnan": 2,
"valid-typeof": [ "error", { "requireStringLiterals": true } ],
"wrap-iife": "error",
"yoda": "error"
},
"overrides": [
{
"files": [
"frontend_tests/**/*.{js,ts}",
"static/js/**/*.{js,ts}"
],
"globals": {
"$": false,
"ClipboardJS": false,
"FetchStatus": false,
"Filter": false,
"Handlebars": false,
"LightboxCanvas": false,
"MessageListData": false,
"MessageListView": false,
"Plotly": false,
"Sortable": false,
"WinChan": false,
"XDate": false,
"_": false,
"activity": false,
"admin": false,
"alert_words": false,
"alert_words_ui": false,
"attachments_ui": false,
"avatar": false,
"billing": false,
"blueslip": false,
"bot_data": false,
"bridge": false,
"buddy_data": false,
"buddy_list": false,
"channel": false,
"click_handlers": false,
"color_data": false,
"colorspace": false,
"common": false,
"components": false,
"compose": false,
"compose_actions": false,
"compose_fade": false,
"compose_pm_pill": false,
"compose_state": false,
"compose_ui": false,
"composebox_typeahead": false,
"condense": false,
"confirm_dialog": false,
"copy_and_paste": false,
"csrf_token": false,
"current_msg_list": true,
"drafts": false,
"dropdown_list_widget": false,
"echo": false,
"emoji": false,
"emoji_picker": false,
"favicon": false,
"feature_flags": false,
"feedback_widget": false,
"fenced_code": false,
"flatpickr": false,
"floating_recipient_bar": false,
"gear_menu": false,
"hash_util": false,
"hashchange": false,
"helpers": false,
"history": false,
"home_msg_list": false,
"hotspots": false,
"i18n": false,
"info_overlay": false,
"input_pill": false,
"invite": false,
"jQuery": false,
"katex": false,
"keydown_util": false,
"lightbox": false,
"list_cursor": false,
"list_render": false,
"list_util": false,
"loading": false,
"localStorage": false,
"local_message": false,
"localstorage": false,
"location": false,
"markdown": false,
"marked": false,
"md5": false,
"message_edit": false,
"message_edit_history": false,
"message_events": false,
"message_fetch": false,
"message_flags": false,
"message_list": false,
"message_live_update": false,
"message_scroll": false,
"message_store": false,
"message_util": false,
"message_viewport": false,
"moment": false,
"muting": false,
"muting_ui": false,
"narrow": false,
"narrow_state": false,
"navigate": false,
"night_mode": false,
"notifications": false,
"overlays": false,
"padded_widget": false,
"page_params": false,
"panels": false,
"people": false,
"pm_conversations": false,
"pm_list": false,
"pm_list_dom": false,
"pointer": false,
"popovers": false,
"presence": false,
"reactions": false,
"realm_icon": false,
"realm_logo": false,
"realm_night_logo": false,
"recent_senders": false,
"recent_topics": false,
"reload": false,
"reload_state": false,
"reminder": false,
"resize": false,
"rows": false,
"rtl": false,
"run_test": false,
"schema": false,
"scroll_bar": false,
"scroll_util": false,
"search": false,
"search_pill": false,
"search_pill_widget": false,
"search_suggestion": false,
"search_util": false,
"sent_messages": false,
"server_events": false,
"server_events_dispatch": false,
"settings": false,
"settings_account": false,
"settings_bots": false,
"settings_display": false,
"settings_emoji": false,
"settings_exports": false,
"settings_linkifiers": false,
"settings_invites": false,
"settings_muting": false,
"settings_notifications": false,
"settings_org": false,
"settings_panel_menu": false,
"settings_profile_fields": false,
"settings_sections": false,
"settings_streams": false,
"settings_toggle": false,
"settings_ui": false,
"settings_user_groups": false,
"settings_users": false,
"spoilers": false,
"starred_messages": false,
"stream_color": false,
"stream_create": false,
"stream_data": false,
"stream_edit": false,
"stream_events": false,
"stream_topic_history": false,
"stream_list": false,
"stream_muting": false,
"stream_popover": false,
"stream_sort": false,
"stream_ui_updates": false,
"StripeCheckout": false,
"submessage": false,
"subs": false,
"tab_bar": false,
"templates": false,
"tictactoe_widget": false,
"timerender": false,
"todo_widget": false,
"top_left_corner": false,
"topic_generator": false,
"topic_list": false,
"topic_zoom": false,
"transmit": false,
"tutorial": false,
"typeahead_helper": false,
"typing": false,
"typing_data": false,
"typing_events": false,
"ui": false,
"ui_init": false,
"ui_report": false,
"ui_util": false,
"unread": false,
"unread_ops": false,
"unread_ui": false,
"upgrade": false,
"upload": false,
"upload_widget": false,
"user_events": false,
"user_groups": false,
"user_pill": false,
"user_search": false,
"user_status": false,
"user_status_ui": false,
"poll_widget": false,
"vdom": false,
"widgetize": false,
"zcommand": false,
"zform": false,
"zxcvbn": false
}
},
{
"files": [
"frontend_tests/casper_tests/*.js",
"frontend_tests/casper_lib/*.js"
],
"rules": {
// Don't require ES features that PhantomJS doesn't support
"comma-dangle": [
"error",
{
"arrays": "always-multiline",
"objects": "always-multiline",
"imports": "always-multiline",
"exports": "always-multiline",
"functions": "never"
}
],
"no-var": "off",
"prefer-arrow-callback": "off"
}
},
{
"files": ["**/*.ts"],
"extends": [
"plugin:@typescript-eslint/recommended"
],
"parserOptions": {
"project": "tsconfig.json"
},
"rules": {
// Disable base rule to avoid conflict
"empty-returns/main": "off",
"indent": "off",
"no-extra-parens": "off",
"semi": "off",
"no-unused-vars": "off",
"no-useless-constructor": "off",
"@typescript-eslint/array-type": "error",
"@typescript-eslint/await-thenable": "error",
"@typescript-eslint/consistent-type-assertions": "error",
"@typescript-eslint/explicit-function-return-type": ["error", { "allowExpressions": true }],
"@typescript-eslint/func-call-spacing": "error",
"@typescript-eslint/indent": "error",
"@typescript-eslint/member-delimiter-style": "error",
"@typescript-eslint/member-ordering": "error",
"@typescript-eslint/no-explicit-any": "off",
"@typescript-eslint/no-extra-parens": ["error", "all"],
"@typescript-eslint/no-extraneous-class": "error",
"@typescript-eslint/no-non-null-assertion": "off",
"@typescript-eslint/no-parameter-properties": "error",
"@typescript-eslint/no-unnecessary-qualifier": "error",
"@typescript-eslint/no-unnecessary-type-assertion": "error",
"@typescript-eslint/no-unused-vars": ["error", { "varsIgnorePattern": "^_" } ],
"@typescript-eslint/no-use-before-define": "error",
"@typescript-eslint/no-useless-constructor": "error",
"@typescript-eslint/prefer-includes": "error",
"@typescript-eslint/prefer-regexp-exec": "error",
"@typescript-eslint/prefer-string-starts-ends-with": "error",
"@typescript-eslint/promise-function-async": "error",
"@typescript-eslint/semi": "error",
"@typescript-eslint/type-annotation-spacing": "error",
"@typescript-eslint/unified-signatures": "error"
}
}
]
"wrap-iife": [ "error", "outside", { "functionPrototypeMethods": false } ],
"wrap-regex": 0,
"yoda": 2
}
}

1
.gitattributes vendored
View File

@@ -11,3 +11,4 @@
*.otf binary
*.tif binary
*.ogg binary
yarn.lock binary

View File

@@ -1,30 +0,0 @@
name: "Code Scanning"
on: [push, pull_request]
jobs:
CodeQL:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v2
with:
# We must fetch at least the immediate parents so that if this is
# a pull request then we can checkout the head.
fetch-depth: 2
# If this run was triggered by a pull request event, then checkout
# the head of the pull request instead of the merge commit.
- run: git checkout HEAD^2
if: ${{ github.event_name == 'pull_request' }}
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v1
# Override language selection by uncommenting this and choosing your languages
# with:
# languages: go, javascript, csharp, python, cpp, java
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v1

View File

@@ -1,171 +0,0 @@
name: Zulip CI
on: [push, pull_request]
defaults:
run:
shell: bash
jobs:
focal_bionic:
strategy:
matrix:
include:
# This docker image was created by a generated Dockerfile at:
# tools/ci/images/bionic/Dockerfile
# Bionic ships with Python 3.6.
- docker_image: mepriyank/actions:bionic
name: Ubuntu 18.04 Bionic (Python 3.6, backend + frontend)
os: bionic
is_bionic: true
include_frontend_tests: true
# This docker image was created by a generated Dockerfile at:
# tools/ci/images/focal/Dockerfile
# Focal ships with Python 3.8.2.
- docker_image: mepriyank/actions:focal
name: Ubuntu 20.04 Focal (Python 3.8, backend)
os: focal
is_focal: true
include_frontend_tests: false
runs-on: ubuntu-latest
name: ${{ matrix.name }}
container: ${{ matrix.docker_image }}
env:
# GitHub Actions sets HOME to /github/home which causes
# problem later in provison and frontend test that runs
# tools/setup/postgres-init-dev-db because of the .pgpass
# location. Postgresql (psql) expects .pgpass to be at
# /home/github/.pgpass and setting home to `/home/github/`
# ensures it written there because we write it to ~/.pgpass.
HOME: /home/github/
steps:
- name: Add required permissions
run: |
# The checkout actions doesn't clone to ~/zulip or allow
# us to use the path option to clone outside the current
# /__w/zulip/zulip directory. Since this directory is owned
# by root we need to change it's ownership to allow the
# github user to clone the code here.
# Note: /__w/ is a docker volume mounted to $GITHUB_WORKSPACE
# which is /home/runner/work/.
sudo chown -R github .
# This is the GitHub Actions specific cache directory the
# the current github user must be able to access for the
# cache action to work. It is owned by root currently.
sudo chmod -R 0777 /__w/_temp/
- uses: actions/checkout@v2
- name: Create cache directories
run: |
dirs=(/srv/zulip-{npm,venv,emoji}-cache)
sudo mkdir -p "${dirs[@]}"
sudo chown -R github "${dirs[@]}"
- name: Restore node_modules cache
uses: actions/cache@v2
with:
path: /srv/zulip-npm-cache
key: v1-yarn-deps-${{ matrix.os }}-${{ hashFiles('package.json') }}-${{ hashFiles('yarn.lock') }}
restore-keys: v1-yarn-deps-${{ matrix.os }}
- name: Restore python cache
uses: actions/cache@v2
with:
path: /srv/zulip-venv-cache
key: v1-venv-${{ matrix.os }}-${{ hashFiles('requirements/thumbor-dev.txt') }}-${{ hashFiles('requirements/dev.txt') }}
restore-keys: v1-venv-${{ matrix.os }}
- name: Restore emoji cache
uses: actions/cache@v2
with:
path: /srv/zulip-emoji-cache
key: v1-emoji-${{ matrix.os }}-${{ hashFiles('tools/setup/emoji/emoji_map.json') }}-${{ hashFiles('tools/setup/emoji/build_emoji') }}-${{ hashFiles('tools/setup/emoji/emoji_setup_utils.py') }}-${{ hashFiles('tools/setup/emoji/emoji_names.py') }}-${{ hashFiles('package.json') }}
restore-keys: v1-emoji-${{ matrix.os }}
- name: Do Bionic hack
if: ${{ matrix.is_bionic }}
run: |
# Temporary hack till `sudo service redis-server start` gets fixes in Bionic. See
# https://chat.zulip.org/#narrow/stream/3-backend/topic/Ubuntu.20bionic.20CircleCI
sudo sed -i '/^bind/s/bind.*/bind 0.0.0.0/' /etc/redis/redis.conf
- name: Install dependencies
run: |
# This is the main setup job for the test suite
mispipe "tools/ci/setup-backend --skip-dev-db-build" ts
# Cleaning caches is mostly unnecessary in GitHub Actions, because
# most builds don't get to write to the cache.
# mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0 2>&1" ts
- name: Run backend tests
run: |
. /srv/zulip-py3-venv/bin/activate && \
mispipe "./tools/ci/backend 2>&1" ts
- name: Run frontend tests
if: ${{ matrix.include_frontend_tests }}
run: |
. /srv/zulip-py3-venv/bin/activate
mispipe "./tools/ci/frontend 2>&1" ts
- name: Test locked requirements
if: ${{ matrix.is_bionic }}
run: |
. /srv/zulip-py3-venv/bin/activate && \
mispipe "./tools/test-locked-requirements 2>&1" ts
- name: Upload coverage reports
# Only upload coverage when both frontend and backend
# tests are ran.
if: ${{ matrix.include_frontend_tests }}
run: |
# Codcov requires `.coverage` file to be stored in the
# current working directory.
mv ./var/.coverage ./.coverage
. /srv/zulip-py3-venv/bin/activate || true
# TODO: Check that the next release of codecov doesn't
# throw find error.
# codecov==2.0.16 introduced a bug which uses "find"
# for locating files which is buggy on some platforms.
# It was fixed via https://github.com/codecov/codecov-python/pull/217
# and should get automatically fixed here once it's released.
# We cannot pin the version here because we need the latest version for uploading files.
# see https://community.codecov.io/t/http-400-while-uploading-to-s3-with-python-codecov-from-travis/1428/7
pip install codecov && codecov || echo "Error in uploading coverage reports to codecov.io."
- name: Store puppeteer artifacts
if: ${{ matrix.include_frontend_tests }}
uses: actions/upload-artifact@v2
with:
name: puppeteer
path: ./var/puppeteer
# We cannot use upload-artifacts actions to upload the test
# reports from /tmp, that directory exists inside the docker
# image. Move them to ./var so we access it outside docker since
# the current directory is volume mounted outside the docker image.
- name: Move test reports to var
run: mv /tmp/zulip-test-event-log/ ./var/
- name: Store test reports
if: ${{ matrix.is_bionic }}
uses: actions/upload-artifact@v2
with:
name: test-reports
path: ./var/zulip-test-event-log/
- name: Check development database build
if: ${{ matrix.is_focal }}
run: mispipe "tools/ci/setup-backend" ts
# TODO: We need to port the notify_failure step from CircleCI
# config, however, it might be the case that GitHub Notifications
# make this unnesscary. More details on settings to configure it:
# https://help.github.com/en/github/managing-subscriptions-and-notifications-on-github/configuring-notifications#github-actions-notification-options

12
.gitignore vendored
View File

@@ -32,13 +32,7 @@ package-lock.json
/.dmypy.json
# Dockerfiles generated for CircleCI
/tools/ci/images
# Generated i18n data
/locale/en
/locale/language_options.json
/locale/language_name_map.json
/locale/*/mobile.json
/tools/circleci/images
# Static build
*.mo
@@ -48,7 +42,6 @@ npm-debug.log
/staticfiles.json
/webpack-stats-production.json
/yarn-error.log
zulip-git-version
# Test / analysis tools
.coverage
@@ -76,9 +69,6 @@ zulip.kdev4
.cache/
.eslintcache
# Core dump files
core
## Miscellaneous
# (Ideally this section is empty.)
zthumbor/thumbor_local_settings.py

View File

@@ -1,7 +1,10 @@
[settings]
src_paths = ., tools, tools/setup/emoji
multi_line_output = 3
known_third_party = zulip
include_trailing_comma = True
use_parentheses = True
line_length = 100
line_length = 79
multi_line_output = 2
balanced_wrapping = true
known_third_party = django, ujson, sqlalchemy
known_first_party = zerver, zproject, version, confirmation, zilencer, analytics, frontend_tests, scripts, corporate
sections = FUTURE, STDLIB, THIRDPARTY, FIRSTPARTY, LOCALFOLDER
lines_after_imports = 1
# See the comment related to ioloop_logging for why this is skipped.
skip = zerver/management/commands/runtornado.py

View File

@@ -1,29 +0,0 @@
Alex Vandiver <alexmv@zulip.com> <alex@chmrr.net>
Alex Vandiver <alexmv@zulip.com> <github@chmrr.net>
Aman Agrawal <amanagr@zulip.com> <f2016561@pilani.bits-pilani.ac.in>
Anders Kaseorg <anders@zulip.com> <anders@zulipchat.com>
Anders Kaseorg <anders@zulip.com> <andersk@mit.edu>
Brock Whittaker <brock@zulipchat.com> <bjwhitta@asu.edu>
Brock Whittaker <brock@zulipchat.com> <brockwhittaker@Brocks-MacBook.local>
Brock Whittaker <brock@zulipchat.com> <brock@zulipchat.org>
Chris Bobbe <cbobbe@zulip.com> <cbobbe@zulipchat.com>
Chris Bobbe <cbobbe@zulip.com> <csbobbe@gmail.com>
Greg Price <greg@zulip.com> <gnprice@gmail.com>
Greg Price <greg@zulip.com> <greg@zulipchat.com>
Greg Price <greg@zulip.com> <price@mit.edu>
Ray Kraesig <rkraesig@zulip.com> <rkraesig@zulipchat.com>
Rishi Gupta <rishig@zulip.com> <rishig+git@mit.edu>
Rishi Gupta <rishig@zulip.com> <rishig@kandralabs.com>
Rishi Gupta <rishig@zulip.com> <rishig@users.noreply.github.com>
Rishi Gupta <rishig@zulip.com> <rishig@zulipchat.com>
Steve Howell <showell@zulip.com> <showell30@yahoo.com>
Steve Howell <showell@zulip.com> <showell@yahoo.com>
Steve Howell <showell@zulip.com> <showell@zulipchat.com>
Steve Howell <showell@zulip.com> <steve@humbughq.com>
Steve Howell <showell@zulip.com> <steve@zulip.com>
Tim Abbott <tabbott@zulip.com> <tabbott@dropbox.com>
Tim Abbott <tabbott@zulip.com> <tabbott@humbughq.com>
Tim Abbott <tabbott@zulip.com> <tabbott@mit.edu>
Tim Abbott <tabbott@zulip.com> <tabbott@zulipchat.com>
Vishnu KS <yo@vishnuks.com> <hackerkid@vishnuks.com>
Vishnu KS <yo@vishnuks.com> <yo@vishnuks.com>

View File

@@ -1,14 +0,0 @@
{
"source_directories": ["."],
"taint_models_path": [
"stubs/taint",
"zulip-py3-venv/lib/pyre_check/taint/"
],
"search_path": [
"stubs/",
"zulip-py3-venv/lib/pyre_check/stubs/"
],
"exclude": [
"/srv/zulip/zulip-py3-venv/.*"
]
}

View File

@@ -1 +0,0 @@
sonar.inclusions=**/*.py,**/*.html

View File

@@ -62,6 +62,5 @@
# Limit language features
"color-no-hex": true,
"color-named": "never",
}
}

67
.travis.yml Normal file
View File

@@ -0,0 +1,67 @@
# See https://zulip.readthedocs.io/en/latest/testing/travis.html for
# high-level documentation on our Travis CI setup.
dist: trusty
group: deprecated-2017Q4
install:
# Disable sometimes-broken sources.list in Travis base images
- sudo rm -vf /etc/apt/sources.list.d/*
- sudo apt-get update
# Disable Travis CI's built-in NVM installation
- mispipe "mv ~/.nvm ~/.travis-nvm-disabled" ts
# Install codecov, the library for the code coverage reporting tool we use
# With a retry to minimize impact of transient networking errors.
- mispipe "pip install codecov" ts || mispipe "pip install codecov" ts
# This is the main setup job for the test suite
- mispipe "tools/travis/setup-$TEST_SUITE" ts
# Clean any caches that are not in use to avoid our cache
# becoming huge.
- mispipe "scripts/lib/clean-unused-caches --verbose --threshold 0" ts
script:
# We unset GEM_PATH here as a hack to work around Travis CI having
# broken running their system puppet with Ruby. See
# https://travis-ci.org/zulip/zulip/jobs/240120991 for an example traceback.
- unset GEM_PATH
- mispipe "./tools/travis/$TEST_SUITE" ts
cache:
yarn: true
apt: false
directories:
- $HOME/zulip-venv-cache
- $HOME/zulip-npm-cache
- $HOME/zulip-emoji-cache
- $HOME/node
- $HOME/misc
env:
global:
- BOTO_CONFIG=/tmp/nowhere
language: python
# Our test suites generally run on Python 3.4, the version in
# Ubuntu 14.04 trusty, which is the oldest OS release we support.
matrix:
include:
# Travis will actually run the jobs in the order they're listed here;
# that doesn't seem to be documented, but it's what we see empirically.
# We only get 4 jobs running at a time, so we try to make the first few
# the most likely to break.
- python: "3.4"
env: TEST_SUITE=production
# Other suites moved to CircleCI -- see .circleci/.
sudo: required
addons:
artifacts:
paths:
# Casper debugging data (screenshots, etc.) is super useful for
# debugging test flakes.
- $(ls var/casper/* | tr "\n" ":")
- $(ls /tmp/zulip-test-event-log/* | tr "\n" ":")
postgresql: "9.3"
apt:
packages:
- moreutils
after_success:
- codecov

View File

@@ -3,31 +3,31 @@ host = https://www.transifex.com
lang_map = zh-Hans: zh_Hans, zh-Hant: zh_Hant
[zulip.djangopo]
file_filter = locale/<lang>/LC_MESSAGES/django.po
source_file = locale/en/LC_MESSAGES/django.po
source_file = static/locale/en/LC_MESSAGES/django.po
source_lang = en
type = PO
file_filter = static/locale/<lang>/LC_MESSAGES/django.po
[zulip.translationsjson]
file_filter = locale/<lang>/translations.json
source_file = locale/en/translations.json
source_file = static/locale/en/translations.json
source_lang = en
type = KEYVALUEJSON
file_filter = static/locale/<lang>/translations.json
[zulip.mobile]
file_filter = locale/<lang>/mobile.json
source_file = locale/en/mobile.json
source_file = static/locale/en/mobile.json
source_lang = en
type = KEYVALUEJSON
file_filter = static/locale/<lang>/mobile.json
[zulip-test.djangopo]
file_filter = locale/<lang>/LC_MESSAGES/django.po
source_file = locale/en/LC_MESSAGES/django.po
source_file = static/locale/en/LC_MESSAGES/django.po
source_lang = en
type = PO
file_filter = static/locale/<lang>/LC_MESSAGES/django.po
[zulip-test.translationsjson]
file_filter = locale/<lang>/translations.json
source_file = locale/en/translations.json
source_file = static/locale/en/translations.json
source_lang = en
type = KEYVALUEJSON
file_filter = static/locale/<lang>/translations.json

View File

@@ -1 +0,0 @@
ignore-scripts true

View File

@@ -78,7 +78,7 @@ something you can do while a violation is happening, do it. A lot of the
harms of harassment and other violations can be mitigated by the victim
knowing that the other people present are on their side.
All reports will be kept confidential. In some cases, we may determine that a
All reports will be kept confidential. In some cases we may determine that a
public statement will need to be made. In such cases, the identities of all
victims and reporters will remain confidential unless those individuals
instruct us otherwise.
@@ -101,5 +101,5 @@ This Code of Conduct is adapted from the
[Citizen Code of Conduct](http://citizencodeofconduct.org/) and the
[Django Code of Conduct](https://www.djangoproject.com/conduct/), and is
under a
[Creative Commons BY-SA](https://creativecommons.org/licenses/by-sa/4.0/)
[Creative Commons BY-SA](http://creativecommons.org/licenses/by-sa/4.0/)
license.

View File

@@ -13,8 +13,7 @@ user, or anything else. Make sure to read the
before posting. The Zulip community is also governed by a
[code of conduct](https://zulip.readthedocs.io/en/latest/code-of-conduct.html).
You can subscribe to zulip-devel-announce@googlegroups.com or our
[Twitter](https://twitter.com/zulip) account for a lower traffic (~1
You can subscribe to zulip-devel@googlegroups.com for a lower traffic (~1
email/month) way to hear about things like mentorship opportunities with Google
Code-in, in-person sprints at conferences, and other opportunities to
contribute.
@@ -29,11 +28,11 @@ needs doing:
[backend](https://github.com/zulip/zulip), web
[frontend](https://github.com/zulip/zulip), React Native
[mobile app](https://github.com/zulip/zulip-mobile), or Electron
[desktop app](https://github.com/zulip/zulip-desktop).
[desktop app](https://github.com/zulip/zulip-electron).
* Building out our
[Python API and bots](https://github.com/zulip/python-zulip-api) framework.
* [Writing an integration](https://zulip.com/api/integrations-overview).
* Improving our [user](https://zulip.com/help/) or
* [Writing an integration](https://zulipchat.com/api/integrations-overview).
* Improving our [user](https://zulipchat.com/help/) or
[developer](https://zulip.readthedocs.io/en/latest/) documentation.
* [Reviewing code](https://zulip.readthedocs.io/en/latest/contributing/code-reviewing.html)
and manually testing pull requests.
@@ -47,7 +46,7 @@ don't require touching the codebase at all. We list a few of them below:
* [Translating](https://zulip.readthedocs.io/en/latest/translating/translating.html)
Zulip.
* [Outreach](#zulip-outreach): Star us on GitHub, upvote us
on product comparison sites, or write for [the Zulip blog](https://blog.zulip.org/).
on product comparison sites, or write for [the Zulip blog](http://blog.zulip.org/).
## Your first (codebase) contribution
@@ -70,11 +69,12 @@ to help.
if you run into any troubles.
* Read the
[Zulip guide to Git](https://zulip.readthedocs.io/en/latest/git/index.html)
and do the Git tutorial (coming soon) if you are unfamiliar with
Git, getting help in
[#git help](https://chat.zulip.org/#narrow/stream/44-git-help) if
you run into any troubles. Be sure to check out the
[extremely useful Zulip-specific tools page](https://zulip.readthedocs.io/en/latest/git/zulip-tools.html).
and do the Git tutorial (coming soon) if you are unfamiliar with Git,
getting help in
[#git help](https://chat.zulip.org/#narrow/stream/44-git-help) if you run
into any troubles.
* Sign the
[Dropbox Contributor License Agreement](https://opensource.dropbox.com/cla/).
### Picking an issue
@@ -84,19 +84,17 @@ on.
* If you're interested in
[mobile](https://github.com/zulip/zulip-mobile/issues?q=is%3Aopen+is%3Aissue),
[desktop](https://github.com/zulip/zulip-desktop/issues?q=is%3Aopen+is%3Aissue),
[desktop](https://github.com/zulip/zulip-electron/issues?q=is%3Aopen+is%3Aissue),
or
[bots](https://github.com/zulip/python-zulip-api/issues?q=is%3Aopen+is%3Aissue)
development, check the respective links for open issues, or post in
[#mobile](https://chat.zulip.org/#narrow/stream/48-mobile),
[#desktop](https://chat.zulip.org/#narrow/stream/16-desktop), or
[#integration](https://chat.zulip.org/#narrow/stream/127-integrations).
* For the main server and web repository, we recommend browsing
recently opened issues to look for issues you are confident you can
fix correctly in a way that clearly communicates why your changes
are the correct fix. Our GitHub workflow bot, zulipbot, limits
users who have 0 commits merged to claiming a single issue labeled
with "good first issue" or "help wanted".
* For the main server and web repository, start by looking through issues
with the label
[good first issue](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A"good+first+issue").
These are smaller projects particularly suitable for a first contribution.
* We also partition all of our issues in the main repo into areas like
admin, compose, emoji, hotkeys, i18n, onboarding, search, etc. Look
through our [list of labels](https://github.com/zulip/zulip/labels), and
@@ -106,8 +104,8 @@ on.
[#new members](https://chat.zulip.org/#narrow/stream/95-new-members) with a
bit about your background and interests, and we'll help you out. The most
important thing to say is whether you're looking for a backend (Python),
frontend (JavaScript and TypeScript), mobile (React Native), desktop (Electron),
documentation (English) or visual design (JavaScript/TypeScript + CSS) issue, and a
frontend (JavaScript), mobile (React Native), desktop (Electron),
documentation (English) or visual design (JavaScript + CSS) issue, and a
bit about your programming experience and available time.
We also welcome suggestions of features that you feel would be valuable or
@@ -120,17 +118,11 @@ Other notes:
* For a first pull request, it's better to aim for a smaller contribution
than a bigger one. Many first contributions have fewer than 10 lines of
changes (not counting changes to tests).
* The full list of issues explicitly looking for a contributor can be
found with the
* The full list of issues looking for a contributor can be found with the
[good first issue](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22good+first+issue%22)
and
[help wanted](https://github.com/zulip/zulip/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22)
labels. Avoid issues with the "difficult" label unless you
understand why it is difficult and are confident you can resolve the
issue correctly and completely. Issues without one of these labels
are fair game if Tim has written a clear technical design proposal
in the issue, or it is a bug that you can reproduce and you are
confident you can fix the issue correctly.
labels.
* For most new contributors, there's a lot to learn while making your first
pull request. It's OK if it takes you a while; that's normal! You'll be
able to work a lot faster as you build experience.
@@ -170,8 +162,9 @@ labels.
## What makes a great Zulip contributor?
Zulip has a lot of experience working with new contributors. In our
experience, these are the best predictors of success:
Zulip runs a lot of [internship programs](#internship-programs), so we have
a lot of experience with new contributors. In our experience, these are the
best predictors of success:
* Posting good questions. This generally means explaining your current
understanding, saying what you've done or tried so far, and including
@@ -191,8 +184,8 @@ experience, these are the best predictors of success:
able to address things within a few days.
* Being helpful and friendly on chat.zulip.org.
These are also the main criteria we use to select candidates for all
of our outreach programs.
These are also the main criteria we use to select interns for all of our
internship programs.
## Reporting issues
@@ -213,9 +206,8 @@ and how to reproduce it if known, your browser/OS if relevant, and a
if appropriate.
**Reporting security issues**. Please do not report security issues
publicly, including on public streams on chat.zulip.org. You can
email security@zulip.com. We create a CVE for every security
issue in our released software.
publicly, including on public streams on chat.zulip.org. You can email
zulip-security@googlegroups.com. We create a CVE for every security issue.
## User feedback
@@ -230,7 +222,7 @@ to:
* Pros and cons: What are the pros and cons of Zulip for your organization,
and the pros and cons of other products you are evaluating?
* Features: What are the features that are most important for your
organization? In the best-case scenario, what would your chat solution do
organization? In the best case scenario, what would your chat solution do
for you?
* Onboarding: If you remember it, what was your impression during your first
few minutes of using Zulip? What did you notice, and how did you feel? Was
@@ -238,20 +230,21 @@ to:
* Organization: What does your organization do? How big is the organization?
A link to your organization's website?
## Outreach programs
## Internship programs
Zulip participates in [Google Summer of Code
(GSoC)](https://developers.google.com/open-source/gsoc/) every year.
In the past, we've also participated in
[Outreachy](https://www.outreachy.org/), [Google
Code-In](https://developers.google.com/open-source/gci/), and hosted
summer interns from Harvard, MIT, and Stanford.
Zulip runs internship programs with
[Outreachy](https://www.outreachy.org/),
[Google Summer of Code (GSoC)](https://developers.google.com/open-source/gsoc/)
[1], and the
[MIT Externship program](https://alum.mit.edu/students/NetworkwithAlumni/ExternshipProgram),
and has in the past taken summer interns from Harvard, MIT, and
Stanford.
While each third-party program has its own rules and requirements, the
Zulip community approaches all of these programs with these ideas in
mind:
* We try to make the application process as valuable for the applicant as
possible. Expect high-quality code reviews, a supportive community, and
possible. Expect high quality code reviews, a supportive community, and
publicly viewable patches you can link to from your resume, regardless of
whether you are selected.
* To apply, you'll have to submit at least one pull request to a Zulip
@@ -265,22 +258,26 @@ mind:
application to make mistakes in your first few PRs as long as your
work improves.
Most of our outreach program participants end up sticking around the
project long-term, and many have become core team members, maintaining
important parts of the project. We hope you apply!
Zulip also participates in
[Google Code-In](https://developers.google.com/open-source/gci/). Our
selection criteria for Finalists and Grand Prize Winners is the same as our
selection criteria for interns above.
Most of our interns end up sticking around the project long-term, and many
quickly become core team members. We hope you apply!
### Google Summer of Code
The largest outreach program Zulip participates in is GSoC (14
students in 2017; 11 in 2018; 17 in 2019). While we don't control how
many slots Google allocates to Zulip, we hope to mentor a similar
number of students in future summers.
GSoC is by far the largest of our internship programs (we had 14 GSoC
students in summer 2017). While we don't control how many slots
Google allocates to Zulip, we hope to mentor a similar number of
students in 2018.
If you're reading this well before the application deadline and want
to make your application strong, we recommend getting involved in the
community and fixing issues in Zulip now. Having good contributions
and building a reputation for doing good work is the best way to have
a strong application. About half of Zulip's GSoC students for Summer
and building a reputation for doing good work is best way to have a
strong application. About half of Zulip's GSoC students for Summer
2017 had made significant contributions to the project by February
2017, and about half had not. Our
[GSoC project ideas page][gsoc-guide] has lots more details on how
@@ -299,6 +296,10 @@ for ZSoC, we'll contact you when the GSoC results are announced.
[gsoc-guide]: https://zulip.readthedocs.io/en/latest/overview/gsoc-ideas.html
[gsoc-faq]: https://developers.google.com/open-source/gsoc/faq
[1] Formally, [GSoC isn't an internship][gsoc-faq], but it is similar
enough that we're treating it as such for the purposes of this
documentation.
## Zulip Outreach
**Upvoting Zulip**. Upvotes and reviews make a big difference in the public
@@ -308,7 +309,7 @@ list typically takes about 15 minutes.
* Star us on GitHub. There are four main repositories:
[server/web](https://github.com/zulip/zulip),
[mobile](https://github.com/zulip/zulip-mobile),
[desktop](https://github.com/zulip/zulip-desktop), and
[desktop](https://github.com/zulip/zulip-electron), and
[Python API](https://github.com/zulip/python-zulip-api).
* [Follow us](https://twitter.com/zulip) on Twitter.
@@ -333,7 +334,7 @@ have been using Zulip for a while and want to contribute more.
about a technical aspect of Zulip can be a great way to spread the word
about Zulip.
We also occasionally [publish](https://blog.zulip.org/) long-form
We also occasionally [publish](http://blog.zulip.org/) longer form
articles related to Zulip. Our posts typically get tens of thousands
of views, and we always have good ideas for blog posts that we can
outline but don't have time to write. If you are an experienced writer

17
Dockerfile-dev Normal file
View File

@@ -0,0 +1,17 @@
FROM ubuntu:trusty
EXPOSE 9991
RUN apt-get update && apt-get install -y wget
RUN localedef -i en_US -f UTF-8 en_US.UTF-8
RUN useradd -d /home/zulip -m zulip && echo 'zulip ALL=(ALL) NOPASSWD:ALL' >> /etc/sudoers
USER zulip
RUN ln -nsf /srv/zulip ~/zulip
RUN echo 'export LC_ALL="en_US.UTF-8" LANG="en_US.UTF-8" LANGUAGE="en_US.UTF-8"' >> ~zulip/.bashrc
WORKDIR /srv/zulip

View File

@@ -1,15 +1,42 @@
# To build run `docker build -f Dockerfile-postgresql .` from the root of the
# zulip repo.
# Currently the postgres images do not support automatic upgrading of
# Install build tools and build tsearch_extras for the current postgres
# version. Currently the postgres images do not support automatic upgrading of
# the on-disk data in volumes. So the base image can not currently be upgraded
# without users needing a manual pgdump and restore.
FROM postgres:10
RUN apt-get update \
&& DEBIAN_FRONTEND=noninteractive apt-get install -y \
postgresql-server-dev-$PG_MAJOR \
postgresql-server-dev-all \
git \
build-essential \
fakeroot \
devscripts
RUN git clone https://github.com/zulip/tsearch_extras.git \
&& cd tsearch_extras \
&& echo $PG_MAJOR > debian/pgversions \
&& pg_buildext updatecontrol \
&& debuild -b -uc -us
# Install hunspell, zulip stop words, and run zulip database
# Install tsearch_extras, hunspell, zulip stop words, and run zulip database
# init.
FROM groonga/pgroonga:latest-alpine-10-slim
RUN apk add -U --no-cache hunspell-en
RUN ln -sf /usr/share/hunspell/en_US.dic /usr/local/share/postgresql/tsearch_data/en_us.dict && ln -sf /usr/share/hunspell/en_US.aff /usr/local/share/postgresql/tsearch_data/en_us.affix
COPY puppet/zulip/files/postgresql/zulip_english.stop /usr/local/share/postgresql/tsearch_data/zulip_english.stop
COPY scripts/setup/create-db.sql /docker-entrypoint-initdb.d/zulip-create-db.sql
COPY scripts/setup/create-pgroonga.sql /docker-entrypoint-initdb.d/zulip-create-pgroonga.sql
FROM postgres:10
ENV TSEARCH_EXTRAS_VERSION=0.4
ENV TSEARCH_EXTRAS_DEB=postgresql-${PG_MAJOR}-tsearch-extras_${TSEARCH_EXTRAS_VERSION}_amd64.deb
COPY --from=0 /${TSEARCH_EXTRAS_DEB} /tmp
COPY puppet/zulip/files/postgresql/zulip_english.stop /usr/share/postgresql/$PG_MAJOR/tsearch_data/zulip_english.stop
COPY scripts/setup/postgres-create-db /docker-entrypoint-initdb.d/postgres-create-db.sh
COPY scripts/setup/pgroonga-debian.asc /tmp
RUN apt-key add /tmp/pgroonga-debian.asc \
&& echo "deb http://packages.groonga.org/debian/ stretch main" > /etc/apt/sources.list.d/zulip.list \
&& apt-get update \
&& DEBIAN_FRONTEND=noninteractive apt-get install --no-install-recommends -y \
hunspell-en-us \
postgresql-${PG_MAJOR}-pgroonga \
&& DEBIAN_FRONTEND=noninteractive dpkg -i /tmp/${TSEARCH_EXTRAS_DEB} \
&& rm /tmp/${TSEARCH_EXTRAS_DEB} \
&& ln -sf /var/cache/postgresql/dicts/en_us.dict "/usr/share/postgresql/$PG_MAJOR/tsearch_data/en_us.dict" \
&& ln -sf /var/cache/postgresql/dicts/en_us.affix "/usr/share/postgresql/$PG_MAJOR/tsearch_data/en_us.affix" \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -1,4 +1,4 @@
Copyright 2011-2020 Dropbox, Inc., Kandra Labs, Inc., and contributors
Copyright 2011-2018 Dropbox, Inc., Kandra Labs, Inc., and contributors
Apache License
Version 2.0, January 2004

View File

@@ -5,13 +5,13 @@ immediacy of real-time chat with the productivity benefits of threaded
conversations. Zulip is used by open source projects, Fortune 500 companies,
large standards bodies, and others who need a real-time chat system that
allows users to easily process hundreds or thousands of messages a day. With
over 500 contributors merging over 500 commits a month, Zulip is also the
over 300 contributors merging over 500 commits a month, Zulip is also the
largest and fastest growing open source group chat project.
[![CircleCI branch](https://img.shields.io/circleci/project/github/zulip/zulip/master.svg)](https://circleci.com/gh/zulip/zulip/tree/master)
[![Coverage Status](https://img.shields.io/codecov/c/github/zulip/zulip/master.svg)](https://codecov.io/gh/zulip/zulip/branch/master)
[![CircleCI Build Status](https://circleci.com/gh/zulip/zulip.svg?style=svg)](https://circleci.com/gh/zulip/zulip)
[![Travis Build Status](https://travis-ci.org/zulip/zulip.svg?branch=master)](https://travis-ci.org/zulip/zulip)
[![Coverage Status](https://img.shields.io/codecov/c/github/zulip/zulip.svg)](https://codecov.io/gh/zulip/zulip)
[![Mypy coverage](https://img.shields.io/badge/mypy-100%25-green.svg)][mypy-coverage]
[![GitHub release](https://img.shields.io/github/release/zulip/zulip.svg)](https://github.com/zulip/zulip/releases/latest)
[![docs](https://readthedocs.org/projects/zulip/badge/?version=latest)](https://zulip.readthedocs.io/en/latest/)
[![Zulip chat](https://img.shields.io/badge/zulip-join_chat-brightgreen.svg)](https://chat.zulip.org)
[![Twitter](https://img.shields.io/badge/twitter-@zulip-blue.svg?style=flat)](https://twitter.com/zulip)
@@ -29,12 +29,12 @@ You might be interested in:
* **Contributing code**. Check out our
[guide for new contributors](https://zulip.readthedocs.io/en/latest/overview/contributing.html)
to get started. Zulip prides itself on maintaining a clean and
to get started. Zulip prides itself on maintaining a clean and
well-tested codebase, and a stock of hundreds of
[beginner-friendly issues][beginner-friendly].
* **Contributing non-code**.
[Report an issue](https://zulip.readthedocs.io/en/latest/overview/contributing.html#reporting-issues),
[Report an issue](https://zulip.readthedocs.io/en/latest/overview/contributing.html#reporting-issue),
[translate](https://zulip.readthedocs.io/en/latest/translating/translating.html) Zulip
into your language,
[write](https://zulip.readthedocs.io/en/latest/overview/contributing.html#zulip-outreach)
@@ -51,25 +51,31 @@ You might be interested in:
the
[Zulip community server](https://zulip.readthedocs.io/en/latest/contributing/chat-zulip-org.html). We
also recommend reading Zulip for
[open source](https://zulip.com/for/open-source/), Zulip for
[companies](https://zulip.com/for/companies/), or Zulip for
[working groups and part time communities](https://zulip.com/for/working-groups-and-communities/).
[open source](https://zulipchat.com/for/open-source/), Zulip for
[companies](https://zulipchat.com/for/companies/), or Zulip for
[working groups and part time communities](https://zulipchat.com/for/working-groups-and-communities/).
* **Running a Zulip server**. Use a preconfigured [Digital Ocean droplet](https://marketplace.digitalocean.com/apps/zulip),
[install Zulip](https://zulip.readthedocs.io/en/stable/production/install.html)
directly, or use Zulip's
experimental [Docker image](https://zulip.readthedocs.io/en/latest/production/deployment.html#zulip-in-docker).
Commercial support is available; see <https://zulip.com/plans> for details.
* **Running a Zulip server**. Setting up a server takes just a couple
of minutes. Zulip runs on Ubuntu 18.04 Bionic, Ubuntu 16.04 Xenial,
Ubuntu 14.04 Trusty, and Debian 9 Stretch. The installation process is
[documented here](https://zulip.readthedocs.io/en/stable/prod.html).
Commercial support is available; see <https://zulipchat.com/plans>
for details.
* **Using Zulip without setting up a server**. <https://zulip.com>
offers free and commercial hosting, including providing our paid
plan for free to fellow open source projects.
* **Using Zulip without setting up a server**. <https://zulipchat.com> offers
free and commercial hosting.
* **Participating in [outreach
programs](https://zulip.readthedocs.io/en/latest/overview/contributing.html#outreach-programs)**
like Google Summer of Code.
* **Applying for a Zulip internship**. Zulip runs internship programs with
[Outreachy](https://www.outreachy.org/),
[Google Summer of Code](https://developers.google.com/open-source/gsoc/),
and the
[MIT Externship program](https://alum.mit.edu/students/NetworkwithAlumni/ExternshipProgram). Zulip
also participates in
[Google Code-In](https://developers.google.com/open-source/gci/). More
information is available
[here](https://zulip.readthedocs.io/en/latest/overview/contributing.html#internship-programs).
You may also be interested in reading our [blog](https://blog.zulip.org/) or
You may also be interested in reading our [blog](http://blog.zulip.org/) or
following us on [twitter](https://twitter.com/zulip).
Zulip is distributed under the
[Apache 2.0](https://github.com/zulip/zulip/blob/master/LICENSE) license.

View File

@@ -1,28 +0,0 @@
# Security Policy
Security announcements are sent to zulip-announce@googlegroups.com,
so you should subscribe if you are running Zulip in production.
## Reporting a Vulnerability
We love responsible reports of (potential) security issues in Zulip,
whether in the latest release or our development branch.
Our security contact is security@zulip.com. Reporters should expect a
response within 24 hours.
Please include details on the issue and how you'd like to be credited
in our release notes when we publish the fix.
Our [security
model](https://zulip.readthedocs.io/en/latest/production/security-model.html)
document may be a helpful resource.
## Supported Versions
Zulip provides security support for the latest major release, in the
form of minor security/maintenance releases.
We work hard to make
[upgrades](https://zulip.readthedocs.io/en/latest/production/upgrade-or-modify.html#upgrading-to-a-release)
reliable, so that there's no reason to run older major releases.

129
Vagrantfile vendored
View File

@@ -19,6 +19,43 @@ if Vagrant::VERSION == "1.8.7" then
end
end
# Workaround: the lxc-config in vagrant-lxc is incompatible with changes in
# LXC 2.1.0, found in Ubuntu 17.10 artful. LXC 2.1.1 (in 18.04 LTS bionic)
# ignores the old config key, so this will only be needed for artful.
#
# vagrant-lxc upstream has an attempted fix:
# https://github.com/fgrehm/vagrant-lxc/issues/445
# but it didn't work in our testing. This is a temporary issue, so we just
# hack in a fix: we patch the skeleton `lxc-config` file right in the
# distribution of the vagrant-lxc "box" we use. If the user doesn't yet
# have the box (e.g. on first setup), Vagrant would download it but too
# late for us to patch it like this; so we prompt them to explicitly add it
# first and then rerun.
if ['up', 'provision'].include? ARGV[0]
if command? "lxc-ls"
LXC_VERSION = `lxc-ls --version`.strip unless defined? LXC_VERSION
if LXC_VERSION == "2.1.0"
lxc_config_file = ENV['HOME'] + "/.vagrant.d/boxes/fgrehm-VAGRANTSLASH-trusty64-lxc/1.2.0/lxc/lxc-config"
if File.file?(lxc_config_file)
lines = File.readlines(lxc_config_file)
deprecated_line = "lxc.pivotdir = lxc_putold\n"
if lines[1] == deprecated_line
lines[1] = "# #{deprecated_line}"
File.open(lxc_config_file, 'w') do |f|
f.puts(lines)
end
end
else
puts 'You are running LXC 2.1.0, and fgrehm/trusty64-lxc box is incompatible '\
"with it by default. First add the box by doing:\n"\
" vagrant box add https://vagrantcloud.com/fgrehm/trusty64-lxc\n"\
'Once this command succeeds, do "vagrant up" again.'
exit
end
end
end
end
# Workaround: Vagrant removed the atlas.hashicorp.com to
# vagrantcloud.com redirect in February 2018. The value of
# DEFAULT_SERVER_URL in Vagrant versions less than 1.9.3 is
@@ -29,38 +66,24 @@ if Vagrant::DEFAULT_SERVER_URL == "atlas.hashicorp.com"
Vagrant::DEFAULT_SERVER_URL.replace('https://vagrantcloud.com')
end
# Monkey patch https://github.com/hashicorp/vagrant/pull/10879 so we
# can fall back to another provider if docker is not installed.
begin
require Vagrant.source_root.join("plugins", "providers", "docker", "provider")
rescue LoadError
else
VagrantPlugins::DockerProvider::Provider.class_eval do
method(:usable?).owner == singleton_class or def self.usable?(raise_error=false)
VagrantPlugins::DockerProvider::Driver.new.execute("docker", "version")
true
rescue Vagrant::Errors::CommandUnavailable, VagrantPlugins::DockerProvider::Errors::ExecuteError
raise if raise_error
return false
end
end
end
Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
# For LXC. VirtualBox hosts use a different box, described below.
config.vm.box = "fgrehm/trusty64-lxc"
# The Zulip development environment runs on 9991 on the guest.
host_port = 9991
http_proxy = https_proxy = no_proxy = nil
host_ip_addr = "127.0.0.1"
# System settings for the virtual machine.
vm_num_cpus = "2"
vm_memory = "2048"
ubuntu_mirror = ""
config.vm.synced_folder ".", "/vagrant", disabled: true
config.vm.synced_folder ".", "/srv/zulip"
if (/darwin/ =~ RUBY_PLATFORM) != nil
config.vm.synced_folder ".", "/srv/zulip", type: "nfs",
linux__nfs_options: ['rw']
config.vm.network "private_network", type: "dhcp"
else
config.vm.synced_folder ".", "/srv/zulip"
end
vagrant_config_file = ENV['HOME'] + "/.zulip-vagrant-config"
if File.file?(vagrant_config_file)
@@ -74,9 +97,6 @@ Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
when "NO_PROXY"; no_proxy = value
when "HOST_PORT"; host_port = value.to_i
when "HOST_IP_ADDR"; host_ip_addr = value
when "GUEST_CPUS"; vm_num_cpus = value
when "GUEST_MEMORY_MB"; vm_memory = value
when "UBUNTU_MIRROR"; ubuntu_mirror = value
end
end
end
@@ -102,22 +122,35 @@ Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
config.vm.network "forwarded_port", guest: 9991, host: host_port, host_ip: host_ip_addr
config.vm.network "forwarded_port", guest: 9994, host: host_port + 3, host_ip: host_ip_addr
# Specify Docker provider before VirtualBox provider so it's preferred.
config.vm.provider "docker" do |d, override|
d.build_dir = File.join(__dir__, "tools", "setup", "dev-vagrant-docker")
d.build_args = ["--build-arg", "VAGRANT_UID=#{Process.uid}"]
if !ubuntu_mirror.empty?
d.build_args += ["--build-arg", "UBUNTU_MIRROR=#{ubuntu_mirror}"]
# Specify LXC provider before VirtualBox provider so it's preferred.
config.vm.provider "lxc" do |lxc|
if command? "lxc-ls"
LXC_VERSION = `lxc-ls --version`.strip unless defined? LXC_VERSION
if LXC_VERSION >= "1.1.0" and LXC_VERSION < "3.0.0"
# Allow start without AppArmor, otherwise Box will not Start on Ubuntu 14.10
# see https://github.com/fgrehm/vagrant-lxc/issues/333
lxc.customize 'aa_allow_incomplete', 1
end
if LXC_VERSION >= "3.0.0"
lxc.customize 'apparmor.allow_incomplete', 1
end
if LXC_VERSION >= "2.0.0"
lxc.backingstore = 'dir'
end
end
d.has_ssh = true
d.create_args = ["--ulimit", "nofile=1024:65536"]
end
config.vm.provider "virtualbox" do |vb, override|
override.vm.box = "hashicorp/bionic64"
override.vm.box = "ubuntu/trusty64"
# It's possible we can get away with just 1.5GB; more testing needed
vb.memory = vm_memory
vb.cpus = vm_num_cpus
vb.memory = 2048
vb.cpus = 2
end
config.vm.provider "vmware_fusion" do |vb, override|
override.vm.box = "puphpet/ubuntu1404-x64"
vb.vmx["memsize"] = "2048"
vb.vmx["numvcpus"] = "2"
end
$provision_script = <<SCRIPT
@@ -129,15 +162,19 @@ set -o pipefail
# something that we don't want to happen when running provision in a
# development environment not using Vagrant.
# Set the Ubuntu mirror
[ ! '#{ubuntu_mirror}' ] || sudo sed -i 's|http://\\(\\w*\\.\\)*archive\\.ubuntu\\.com/ubuntu/\\? |#{ubuntu_mirror} |' /etc/apt/sources.list
# Set the MOTD on the system to have Zulip instructions
sudo ln -nsf /srv/zulip/tools/setup/dev-motd /etc/update-motd.d/99-zulip-dev
sudo rm -f /etc/update-motd.d/10-help-text
sudo dpkg --purge landscape-client landscape-common ubuntu-release-upgrader-core update-manager-core update-notifier-common ubuntu-server
sudo dpkg-divert --add --rename /etc/default/motd-news
sudo sh -c 'echo ENABLED=0 > /etc/default/motd-news'
sudo rm -f /etc/update-motd.d/*
sudo bash -c 'cat << EndOfMessage > /etc/motd
Welcome to the Zulip development environment! Popular commands:
* tools/provision - Update the development environment
* tools/run-dev.py - Run the development server
* tools/lint - Run the linter (quick and catches many problems)
* tools/test-* - Run tests (use --help to learn about options)
Read https://zulip.readthedocs.io/en/latest/testing/testing.html to learn
how to run individual test suites so that you can get a fast debug cycle.
EndOfMessage'
# If the host is running SELinux remount the /sys/fs/selinux directory as read only,
# needed for apt-get to work.

View File

@@ -1,35 +1,22 @@
import logging
import time
from collections import OrderedDict, defaultdict
from datetime import datetime, timedelta
from typing import Callable, Dict, Optional, Sequence, Tuple, Type, Union
import logging
from typing import Any, Callable, Dict, List, \
Optional, Tuple, Type, Union
from django.conf import settings
from django.db import connection
from django.db import connection, models
from django.db.models import F
from psycopg2.sql import SQL, Composable, Identifier, Literal
from analytics.models import (
BaseCount,
FillState,
InstallationCount,
RealmCount,
StreamCount,
UserCount,
installation_epoch,
last_successful_fill,
)
from analytics.models import Anomaly, BaseCount, \
FillState, InstallationCount, RealmCount, StreamCount, \
UserCount, installation_epoch, last_successful_fill
from zerver.lib.logging_util import log_to_file
from zerver.lib.timestamp import ceiling_to_day, ceiling_to_hour, floor_to_hour, verify_UTC
from zerver.models import (
Message,
Realm,
RealmAuditLog,
Stream,
UserActivityInterval,
UserProfile,
models,
)
from zerver.lib.timestamp import ceiling_to_day, \
ceiling_to_hour, floor_to_hour, verify_UTC
from zerver.models import Message, Realm, \
Stream, UserActivityInterval, UserProfile, models
## Logging setup ##
@@ -52,7 +39,7 @@ class CountStat:
self.data_collector = data_collector
# might have to do something different for bitfields
if frequency not in self.FREQUENCIES:
raise AssertionError(f"Unknown frequency: {frequency}")
raise AssertionError("Unknown frequency: %s" % (frequency,))
self.frequency = frequency
if interval is not None:
self.interval = interval
@@ -62,7 +49,7 @@ class CountStat:
self.interval = timedelta(days=1)
def __str__(self) -> str:
return f"<CountStat: {self.property}>"
return "<CountStat: %s>" % (self.property,)
class LoggingCountStat(CountStat):
def __init__(self, property: str, output_table: Type[BaseCount], frequency: str) -> None:
@@ -70,39 +57,29 @@ class LoggingCountStat(CountStat):
class DependentCountStat(CountStat):
def __init__(self, property: str, data_collector: 'DataCollector', frequency: str,
interval: Optional[timedelta] = None, dependencies: Sequence[str] = []) -> None:
interval: Optional[timedelta]=None, dependencies: List[str]=[]) -> None:
CountStat.__init__(self, property, data_collector, frequency, interval=interval)
self.dependencies = dependencies
class DataCollector:
def __init__(self, output_table: Type[BaseCount],
pull_function: Optional[Callable[[str, datetime, datetime, Optional[Realm]], int]]) -> None:
pull_function: Optional[Callable[[str, datetime, datetime], int]]) -> None:
self.output_table = output_table
self.pull_function = pull_function
## CountStat-level operations ##
def process_count_stat(stat: CountStat, fill_to_time: datetime,
realm: Optional[Realm]=None) -> None:
# TODO: The realm argument is not yet supported, in that we don't
# have a solution for how to update FillState if it is passed. It
# exists solely as partial plumbing for when we do fully implement
# doing single-realm analytics runs for use cases like data import.
#
# Also, note that for the realm argument to be properly supported,
# the CountStat object passed in needs to have come from
# E.g. get_count_stats(realm), i.e. have the realm_id already
# entered into the SQL query defined by the CountState object.
def process_count_stat(stat: CountStat, fill_to_time: datetime) -> None:
if stat.frequency == CountStat.HOUR:
time_increment = timedelta(hours=1)
elif stat.frequency == CountStat.DAY:
time_increment = timedelta(days=1)
else:
raise AssertionError(f"Unknown frequency: {stat.frequency}")
raise AssertionError("Unknown frequency: %s" % (stat.frequency,))
verify_UTC(fill_to_time)
if floor_to_hour(fill_to_time) != fill_to_time:
raise ValueError(f"fill_to_time must be on an hour boundary: {fill_to_time}")
raise ValueError("fill_to_time must be on an hour boundary: %s" % (fill_to_time,))
fill_state = FillState.objects.filter(property=stat.property).first()
if fill_state is None:
@@ -110,37 +87,37 @@ def process_count_stat(stat: CountStat, fill_to_time: datetime,
fill_state = FillState.objects.create(property=stat.property,
end_time=currently_filled,
state=FillState.DONE)
logger.info("INITIALIZED %s %s", stat.property, currently_filled)
logger.info("INITIALIZED %s %s" % (stat.property, currently_filled))
elif fill_state.state == FillState.STARTED:
logger.info("UNDO START %s %s", stat.property, fill_state.end_time)
logger.info("UNDO START %s %s" % (stat.property, fill_state.end_time))
do_delete_counts_at_hour(stat, fill_state.end_time)
currently_filled = fill_state.end_time - time_increment
do_update_fill_state(fill_state, currently_filled, FillState.DONE)
logger.info("UNDO DONE %s", stat.property)
logger.info("UNDO DONE %s" % (stat.property,))
elif fill_state.state == FillState.DONE:
currently_filled = fill_state.end_time
else:
raise AssertionError(f"Unknown value for FillState.state: {fill_state.state}.")
raise AssertionError("Unknown value for FillState.state: %s." % (fill_state.state,))
if isinstance(stat, DependentCountStat):
for dependency in stat.dependencies:
dependency_fill_time = last_successful_fill(dependency)
if dependency_fill_time is None:
logger.warning("DependentCountStat %s run before dependency %s.",
stat.property, dependency)
logger.warning("DependentCountStat %s run before dependency %s." %
(stat.property, dependency))
return
fill_to_time = min(fill_to_time, dependency_fill_time)
currently_filled = currently_filled + time_increment
while currently_filled <= fill_to_time:
logger.info("START %s %s", stat.property, currently_filled)
logger.info("START %s %s" % (stat.property, currently_filled))
start = time.time()
do_update_fill_state(fill_state, currently_filled, FillState.STARTED)
do_fill_count_stat_at_hour(stat, currently_filled, realm)
do_fill_count_stat_at_hour(stat, currently_filled)
do_update_fill_state(fill_state, currently_filled, FillState.DONE)
end = time.time()
currently_filled = currently_filled + time_increment
logger.info("DONE %s (%dms)", stat.property, (end-start)*1000)
logger.info("DONE %s (%dms)" % (stat.property, (end-start)*1000))
def do_update_fill_state(fill_state: FillState, end_time: datetime, state: int) -> None:
fill_state.end_time = end_time
@@ -149,15 +126,15 @@ def do_update_fill_state(fill_state: FillState, end_time: datetime, state: int)
# We assume end_time is valid (e.g. is on a day or hour boundary as appropriate)
# and is timezone aware. It is the caller's responsibility to enforce this!
def do_fill_count_stat_at_hour(stat: CountStat, end_time: datetime, realm: Optional[Realm]=None) -> None:
def do_fill_count_stat_at_hour(stat: CountStat, end_time: datetime) -> None:
start_time = end_time - stat.interval
if not isinstance(stat, LoggingCountStat):
timer = time.time()
assert(stat.data_collector.pull_function is not None)
rows_added = stat.data_collector.pull_function(stat.property, start_time, end_time, realm)
logger.info("%s run pull_function (%dms/%sr)",
stat.property, (time.time()-timer)*1000, rows_added)
do_aggregate_to_summary_table(stat, end_time, realm)
rows_added = stat.data_collector.pull_function(stat.property, start_time, end_time)
logger.info("%s run pull_function (%dms/%sr)" %
(stat.property, (time.time()-timer)*1000, rows_added))
do_aggregate_to_summary_table(stat, end_time)
def do_delete_counts_at_hour(stat: CountStat, end_time: datetime) -> None:
if isinstance(stat, LoggingCountStat):
@@ -170,76 +147,51 @@ def do_delete_counts_at_hour(stat: CountStat, end_time: datetime) -> None:
RealmCount.objects.filter(property=stat.property, end_time=end_time).delete()
InstallationCount.objects.filter(property=stat.property, end_time=end_time).delete()
def do_aggregate_to_summary_table(stat: CountStat, end_time: datetime,
realm: Optional[Realm]=None) -> None:
def do_aggregate_to_summary_table(stat: CountStat, end_time: datetime) -> None:
cursor = connection.cursor()
# Aggregate into RealmCount
output_table = stat.data_collector.output_table
if realm is not None:
realm_clause = SQL("AND zerver_realm.id = {}").format(Literal(realm.id))
else:
realm_clause = SQL("")
if output_table in (UserCount, StreamCount):
realmcount_query = SQL("""
realmcount_query = """
INSERT INTO analytics_realmcount
(realm_id, value, property, subgroup, end_time)
SELECT
zerver_realm.id, COALESCE(sum({output_table}.value), 0), %(property)s,
{output_table}.subgroup, %(end_time)s
zerver_realm.id, COALESCE(sum(%(output_table)s.value), 0), '%(property)s',
%(output_table)s.subgroup, %%(end_time)s
FROM zerver_realm
JOIN {output_table}
JOIN %(output_table)s
ON
zerver_realm.id = {output_table}.realm_id
zerver_realm.id = %(output_table)s.realm_id
WHERE
{output_table}.property = %(property)s AND
{output_table}.end_time = %(end_time)s
{realm_clause}
GROUP BY zerver_realm.id, {output_table}.subgroup
""").format(
output_table=Identifier(output_table._meta.db_table),
realm_clause=realm_clause,
)
%(output_table)s.property = '%(property)s' AND
%(output_table)s.end_time = %%(end_time)s
GROUP BY zerver_realm.id, %(output_table)s.subgroup
""" % {'output_table': output_table._meta.db_table,
'property': stat.property}
start = time.time()
cursor.execute(realmcount_query, {
'property': stat.property,
'end_time': end_time,
})
cursor.execute(realmcount_query, {'end_time': end_time})
end = time.time()
logger.info(
"%s RealmCount aggregation (%dms/%sr)",
stat.property, (end - start) * 1000, cursor.rowcount,
)
if realm is None:
# Aggregate into InstallationCount. Only run if we just
# processed counts for all realms.
#
# TODO: Add support for updating installation data after
# changing an individual realm's values.
installationcount_query = SQL("""
INSERT INTO analytics_installationcount
(value, property, subgroup, end_time)
SELECT
sum(value), %(property)s, analytics_realmcount.subgroup, %(end_time)s
FROM analytics_realmcount
WHERE
property = %(property)s AND
end_time = %(end_time)s
GROUP BY analytics_realmcount.subgroup
""")
start = time.time()
cursor.execute(installationcount_query, {
'property': stat.property,
'end_time': end_time,
})
end = time.time()
logger.info(
"%s InstallationCount aggregation (%dms/%sr)",
stat.property, (end - start) * 1000, cursor.rowcount,
)
logger.info("%s RealmCount aggregation (%dms/%sr)" % (
stat.property, (end - start) * 1000, cursor.rowcount))
# Aggregate into InstallationCount
installationcount_query = """
INSERT INTO analytics_installationcount
(value, property, subgroup, end_time)
SELECT
sum(value), '%(property)s', analytics_realmcount.subgroup, %%(end_time)s
FROM analytics_realmcount
WHERE
property = '%(property)s' AND
end_time = %%(end_time)s
GROUP BY analytics_realmcount.subgroup
""" % {'property': stat.property}
start = time.time()
cursor.execute(installationcount_query, {'end_time': end_time})
end = time.time()
logger.info("%s InstallationCount aggregation (%dms/%sr)" % (
stat.property, (end - start) * 1000, cursor.rowcount))
cursor.close()
## Utility functions called from outside counts.py ##
@@ -248,9 +200,6 @@ def do_aggregate_to_summary_table(stat: CountStat, end_time: datetime,
def do_increment_logging_stat(zerver_object: Union[Realm, UserProfile, Stream], stat: CountStat,
subgroup: Optional[Union[str, int, bool]], event_time: datetime,
increment: int=1) -> None:
if not increment:
return
table = stat.data_collector.output_table
if table == RealmCount:
id_args = {'realm': zerver_object}
@@ -277,6 +226,7 @@ def do_drop_all_analytics_tables() -> None:
RealmCount.objects.all().delete()
InstallationCount.objects.all().delete()
FillState.objects.all().delete()
Anomaly.objects.all().delete()
def do_drop_single_stat(property: str) -> None:
UserCount.objects.filter(property=property).delete()
@@ -287,71 +237,46 @@ def do_drop_single_stat(property: str) -> None:
## DataCollector-level operations ##
QueryFn = Callable[[Dict[str, Composable]], Composable]
def do_pull_by_sql_query(
property: str,
start_time: datetime,
end_time: datetime,
query: QueryFn,
group_by: Optional[Tuple[models.Model, str]],
) -> int:
def do_pull_by_sql_query(property: str, start_time: datetime, end_time: datetime, query: str,
group_by: Optional[Tuple[models.Model, str]]) -> int:
if group_by is None:
subgroup = SQL('NULL')
group_by_clause = SQL('')
subgroup = 'NULL'
group_by_clause = ''
else:
subgroup = Identifier(group_by[0]._meta.db_table, group_by[1])
group_by_clause = SQL(', {}').format(subgroup)
subgroup = '%s.%s' % (group_by[0]._meta.db_table, group_by[1])
group_by_clause = ', ' + subgroup
# We do string replacement here because cursor.execute will reject a
# group_by_clause given as a param.
# We pass in the datetimes as params to cursor.execute so that we don't have to
# think about how to convert python datetimes to SQL datetimes.
query_ = query({
'subgroup': subgroup,
'group_by_clause': group_by_clause,
})
query_ = query % {'property': property, 'subgroup': subgroup,
'group_by_clause': group_by_clause}
cursor = connection.cursor()
cursor.execute(query_, {
'property': property,
'time_start': start_time,
'time_end': end_time,
})
cursor.execute(query_, {'time_start': start_time, 'time_end': end_time})
rowcount = cursor.rowcount
cursor.close()
return rowcount
def sql_data_collector(
output_table: Type[BaseCount],
query: QueryFn,
group_by: Optional[Tuple[models.Model, str]],
) -> DataCollector:
def pull_function(property: str, start_time: datetime, end_time: datetime,
realm: Optional[Realm] = None) -> int:
# The pull function type needs to accept a Realm argument
# because the 'minutes_active::day' CountStat uses
# DataCollector directly for do_pull_minutes_active, which
# requires the realm argument. We ignore it here, because the
# realm should have been already encoded in the `query` we're
# passed.
def sql_data_collector(output_table: Type[BaseCount], query: str,
group_by: Optional[Tuple[models.Model, str]]) -> DataCollector:
def pull_function(property: str, start_time: datetime, end_time: datetime) -> int:
return do_pull_by_sql_query(property, start_time, end_time, query, group_by)
return DataCollector(output_table, pull_function)
def do_pull_minutes_active(property: str, start_time: datetime, end_time: datetime,
realm: Optional[Realm] = None) -> int:
def do_pull_minutes_active(property: str, start_time: datetime, end_time: datetime) -> int:
user_activity_intervals = UserActivityInterval.objects.filter(
end__gt=start_time, start__lt=end_time,
end__gt=start_time, start__lt=end_time
).select_related(
'user_profile',
'user_profile'
).values_list(
'user_profile_id', 'user_profile__realm_id', 'start', 'end')
seconds_active: Dict[Tuple[int, int], float] = defaultdict(float)
seconds_active = defaultdict(float) # type: Dict[Tuple[int, int], float]
for user_id, realm_id, interval_start, interval_end in user_activity_intervals:
if realm is None or realm.id == realm_id:
start = max(start_time, interval_start)
end = min(end_time, interval_end)
seconds_active[(user_id, realm_id)] += (end - start).total_seconds()
start = max(start_time, interval_start)
end = min(end_time, interval_end)
seconds_active[(user_id, realm_id)] += (end - start).total_seconds()
rows = [UserCount(user_id=ids[0], realm_id=ids[1], property=property,
end_time=end_time, value=int(seconds // 60))
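The recurring pattern in this file's refactor is replacing %-style string substitution of SQL fragments with psycopg2.sql composition: table and column names become Identifier objects, constants such as realm.id become Literal objects, optional fragments like realm_clause are plain SQL objects, and runtime values (property, time_start, time_end) remain ordinary parameters bound by cursor.execute. A minimal sketch of that idea follows; the query and function names are illustrative only and are not part of the actual CountStat machinery:

from datetime import datetime
from typing import Optional

from psycopg2.sql import SQL, Composable, Identifier, Literal

def build_sender_count_query(group_by_column: Optional[str], realm_id: Optional[int]) -> Composable:
    # Identifiers and the optional realm filter are composed into the query;
    # values stay behind as %(...)s placeholders for execute() to bind.
    if group_by_column is None:
        subgroup: Composable = SQL("NULL")
        group_by_clause: Composable = SQL("")
    else:
        subgroup = Identifier("zerver_userprofile", group_by_column)
        group_by_clause = SQL(", {}").format(subgroup)
    if realm_id is None:
        realm_clause: Composable = SQL("")
    else:
        realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm_id))
    return SQL("""
        SELECT
            zerver_userprofile.id, count(*), %(property)s, {subgroup}
        FROM zerver_userprofile
        JOIN zerver_message ON zerver_userprofile.id = zerver_message.sender_id
        WHERE
            {realm_clause}
            zerver_message.date_sent >= %(time_start)s AND
            zerver_message.date_sent < %(time_end)s
        GROUP BY zerver_userprofile.id {group_by_clause}
    """).format(subgroup=subgroup, realm_clause=realm_clause, group_by_clause=group_by_clause)

def run_sender_count(cursor, property: str, start_time: datetime, end_time: datetime) -> int:
    # cursor.execute accepts a Composable directly; the datetimes and the
    # property string are bound as parameters, never interpolated into the SQL text.
    cursor.execute(build_sender_count_query("is_bot", realm_id=None),
                   {"property": property, "time_start": start_time, "time_end": end_time})
    return cursor.rowcount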
@@ -359,39 +284,28 @@ def do_pull_minutes_active(property: str, start_time: datetime, end_time: dateti
UserCount.objects.bulk_create(rows)
return len(rows)
def count_message_by_user_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
count_message_by_user_query = """
INSERT INTO analytics_usercount
(user_id, realm_id, value, property, subgroup, end_time)
SELECT
zerver_userprofile.id, zerver_userprofile.realm_id, count(*),
%(property)s, {subgroup}, %(time_end)s
'%(property)s', %(subgroup)s, %%(time_end)s
FROM zerver_userprofile
JOIN zerver_message
ON
zerver_userprofile.id = zerver_message.sender_id
WHERE
zerver_userprofile.date_joined < %(time_end)s AND
zerver_message.date_sent >= %(time_start)s AND
{realm_clause}
zerver_message.date_sent < %(time_end)s
GROUP BY zerver_userprofile.id {group_by_clause}
""").format(**kwargs, realm_clause=realm_clause)
zerver_userprofile.date_joined < %%(time_end)s AND
zerver_message.pub_date >= %%(time_start)s AND
zerver_message.pub_date < %%(time_end)s
GROUP BY zerver_userprofile.id %(group_by_clause)s
"""
# Note: ignores the group_by / group_by_clause.
def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
count_message_type_by_user_query = """
INSERT INTO analytics_usercount
(realm_id, user_id, value, property, subgroup, end_time)
SELECT realm_id, id, SUM(count) AS value, %(property)s, message_type, %(time_end)s
SELECT realm_id, id, SUM(count) AS value, '%(property)s', message_type, %%(time_end)s
FROM
(
SELECT zerver_userprofile.realm_id, zerver_userprofile.id, count(*),
@@ -409,9 +323,8 @@ def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
JOIN zerver_message
ON
zerver_userprofile.id = zerver_message.sender_id AND
zerver_message.date_sent >= %(time_start)s AND
{realm_clause}
zerver_message.date_sent < %(time_end)s
zerver_message.pub_date >= %%(time_start)s AND
zerver_message.pub_date < %%(time_end)s
JOIN zerver_recipient
ON
zerver_message.recipient_id = zerver_recipient.id
@@ -423,22 +336,17 @@ def count_message_type_by_user_query(realm: Optional[Realm]) -> QueryFn:
zerver_recipient.type, zerver_stream.invite_only
) AS subquery
GROUP BY realm_id, id, message_type
""").format(**kwargs, realm_clause=realm_clause)
"""
# This query joins to the UserProfile table since all current queries that
# use this also subgroup on UserProfile.is_bot. If in the future there is a
# stat that counts messages by stream and doesn't need the UserProfile
# table, consider writing a new query for efficiency.
def count_message_by_stream_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_stream.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
count_message_by_stream_query = """
INSERT INTO analytics_streamcount
(stream_id, realm_id, value, property, subgroup, end_time)
SELECT
zerver_stream.id, zerver_stream.realm_id, count(*), %(property)s, {subgroup}, %(time_end)s
zerver_stream.id, zerver_stream.realm_id, count(*), '%(property)s', %(subgroup)s, %%(time_end)s
FROM zerver_stream
JOIN zerver_recipient
ON
@@ -450,61 +358,48 @@ def count_message_by_stream_query(realm: Optional[Realm]) -> QueryFn:
ON
zerver_message.sender_id = zerver_userprofile.id
WHERE
zerver_stream.date_created < %(time_end)s AND
zerver_stream.date_created < %%(time_end)s AND
zerver_recipient.type = 2 AND
zerver_message.date_sent >= %(time_start)s AND
{realm_clause}
zerver_message.date_sent < %(time_end)s
GROUP BY zerver_stream.id {group_by_clause}
""").format(**kwargs, realm_clause=realm_clause)
zerver_message.pub_date >= %%(time_start)s AND
zerver_message.pub_date < %%(time_end)s
GROUP BY zerver_stream.id %(group_by_clause)s
"""
# Hardcodes the query needed by active_users:is_bot:day, since that is
# currently the only stat that uses this.
def count_user_by_realm_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
count_user_by_realm_query = """
INSERT INTO analytics_realmcount
(realm_id, value, property, subgroup, end_time)
SELECT
zerver_realm.id, count(*), %(property)s, {subgroup}, %(time_end)s
zerver_realm.id, count(*),'%(property)s', %(subgroup)s, %%(time_end)s
FROM zerver_realm
JOIN zerver_userprofile
ON
zerver_realm.id = zerver_userprofile.realm_id
WHERE
zerver_realm.date_created < %(time_end)s AND
zerver_userprofile.date_joined >= %(time_start)s AND
zerver_userprofile.date_joined < %(time_end)s AND
{realm_clause}
zerver_realm.date_created < %%(time_end)s AND
zerver_userprofile.date_joined >= %%(time_start)s AND
zerver_userprofile.date_joined < %%(time_end)s AND
zerver_userprofile.is_active = TRUE
GROUP BY zerver_realm.id {group_by_clause}
""").format(**kwargs, realm_clause=realm_clause)
GROUP BY zerver_realm.id %(group_by_clause)s
"""
# Currently hardcodes the query needed for active_users_audit:is_bot:day.
# Assumes that a user cannot have two RealmAuditLog entries with the same event_time and
# event_type in [RealmAuditLog.USER_CREATED, USER_DEACTIVATED, etc].
# event_type in ['user_created', 'user_deactivated', etc].
# In particular, it's important to ensure that migrations don't cause that to happen.
def check_realmauditlog_by_user_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
check_realmauditlog_by_user_query = """
INSERT INTO analytics_usercount
(user_id, realm_id, value, property, subgroup, end_time)
SELECT
ral1.modified_user_id, ral1.realm_id, 1, %(property)s, {subgroup}, %(time_end)s
ral1.modified_user_id, ral1.realm_id, 1, '%(property)s', %(subgroup)s, %%(time_end)s
FROM zerver_realmauditlog ral1
JOIN (
SELECT modified_user_id, max(event_time) AS max_event_time
FROM zerver_realmauditlog
WHERE
event_type in ({user_created}, {user_activated}, {user_deactivated}, {user_reactivated}) AND
{realm_clause}
event_time < %(time_end)s
event_type in ('user_created', 'user_deactivated', 'user_activated', 'user_reactivated') AND
event_time < %%(time_end)s
GROUP BY modified_user_id
) ral2
ON
@@ -514,181 +409,131 @@ def check_realmauditlog_by_user_query(realm: Optional[Realm]) -> QueryFn:
ON
ral1.modified_user_id = zerver_userprofile.id
WHERE
ral1.event_type in ({user_created}, {user_activated}, {user_reactivated})
""").format(
**kwargs,
user_created=Literal(RealmAuditLog.USER_CREATED),
user_activated=Literal(RealmAuditLog.USER_ACTIVATED),
user_deactivated=Literal(RealmAuditLog.USER_DEACTIVATED),
user_reactivated=Literal(RealmAuditLog.USER_REACTIVATED),
realm_clause=realm_clause,
)
ral1.event_type in ('user_created', 'user_activated', 'user_reactivated')
"""
def check_useractivityinterval_by_user_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("zerver_userprofile.realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
check_useractivityinterval_by_user_query = """
INSERT INTO analytics_usercount
(user_id, realm_id, value, property, subgroup, end_time)
SELECT
zerver_userprofile.id, zerver_userprofile.realm_id, 1, %(property)s, {subgroup}, %(time_end)s
zerver_userprofile.id, zerver_userprofile.realm_id, 1, '%(property)s', %(subgroup)s, %%(time_end)s
FROM zerver_userprofile
JOIN zerver_useractivityinterval
ON
zerver_userprofile.id = zerver_useractivityinterval.user_profile_id
WHERE
zerver_useractivityinterval.end >= %(time_start)s AND
{realm_clause}
zerver_useractivityinterval.start < %(time_end)s
GROUP BY zerver_userprofile.id {group_by_clause}
""").format(**kwargs, realm_clause=realm_clause)
zerver_useractivityinterval.end >= %%(time_start)s AND
zerver_useractivityinterval.start < %%(time_end)s
GROUP BY zerver_userprofile.id %(group_by_clause)s
"""
def count_realm_active_humans_query(realm: Optional[Realm]) -> QueryFn:
if realm is None:
realm_clause = SQL("")
else:
realm_clause = SQL("realm_id = {} AND").format(Literal(realm.id))
return lambda kwargs: SQL("""
count_realm_active_humans_query = """
INSERT INTO analytics_realmcount
(realm_id, value, property, subgroup, end_time)
SELECT
usercount1.realm_id, count(*), %(property)s, NULL, %(time_end)s
usercount1.realm_id, count(*), '%(property)s', NULL, %%(time_end)s
FROM (
SELECT realm_id, user_id
FROM analytics_usercount
WHERE
property = 'active_users_audit:is_bot:day' AND
subgroup = 'false' AND
{realm_clause}
end_time = %(time_end)s
end_time = %%(time_end)s
) usercount1
JOIN (
SELECT realm_id, user_id
FROM analytics_usercount
WHERE
property = '15day_actives::day' AND
{realm_clause}
end_time = %(time_end)s
end_time = %%(time_end)s
) usercount2
ON
usercount1.user_id = usercount2.user_id
GROUP BY usercount1.realm_id
""").format(**kwargs, realm_clause=realm_clause)
"""
# Currently unused and untested
count_stream_by_realm_query = lambda kwargs: SQL("""
count_stream_by_realm_query = """
INSERT INTO analytics_realmcount
(realm_id, value, property, subgroup, end_time)
SELECT
zerver_realm.id, count(*), %(property)s, {subgroup}, %(time_end)s
zerver_realm.id, count(*), '%(property)s', %(subgroup)s, %%(time_end)s
FROM zerver_realm
JOIN zerver_stream
ON
zerver_realm.id = zerver_stream.realm_id AND
WHERE
zerver_realm.date_created < %(time_end)s AND
zerver_stream.date_created >= %(time_start)s AND
zerver_stream.date_created < %(time_end)s
GROUP BY zerver_realm.id {group_by_clause}
""").format(**kwargs)
zerver_realm.date_created < %%(time_end)s AND
zerver_stream.date_created >= %%(time_start)s AND
zerver_stream.date_created < %%(time_end)s
GROUP BY zerver_realm.id %(group_by_clause)s
"""
def get_count_stats(realm: Optional[Realm]=None) -> Dict[str, CountStat]:
## CountStat declarations ##
## CountStat declarations ##
count_stats_ = [
# Messages Sent stats
# Stats that count the number of messages sent in various ways.
# These are also the set of stats that read from the Message table.
count_stats_ = [
# Messages Sent stats
# Stats that count the number of messages sent in various ways.
# These are also the set of stats that read from the Message table.
CountStat('messages_sent:is_bot:hour',
sql_data_collector(UserCount, count_message_by_user_query(
realm), (UserProfile, 'is_bot')),
CountStat.HOUR),
CountStat('messages_sent:message_type:day',
sql_data_collector(
UserCount, count_message_type_by_user_query(realm), None),
CountStat.DAY),
CountStat('messages_sent:client:day',
sql_data_collector(UserCount, count_message_by_user_query(realm),
(Message, 'sending_client_id')), CountStat.DAY),
CountStat('messages_in_stream:is_bot:day',
sql_data_collector(StreamCount, count_message_by_stream_query(realm),
(UserProfile, 'is_bot')), CountStat.DAY),
CountStat('messages_sent:is_bot:hour',
sql_data_collector(UserCount, count_message_by_user_query, (UserProfile, 'is_bot')),
CountStat.HOUR),
CountStat('messages_sent:message_type:day',
sql_data_collector(UserCount, count_message_type_by_user_query, None), CountStat.DAY),
CountStat('messages_sent:client:day',
sql_data_collector(UserCount, count_message_by_user_query, (Message, 'sending_client_id')),
CountStat.DAY),
CountStat('messages_in_stream:is_bot:day',
sql_data_collector(StreamCount, count_message_by_stream_query, (UserProfile, 'is_bot')),
CountStat.DAY),
# Number of Users stats
# Stats that count the number of active users in the UserProfile.is_active sense.
# Number of Users stats
# Stats that count the number of active users in the UserProfile.is_active sense.
# 'active_users_audit:is_bot:day' is the canonical record of which users were
# active on which days (in the UserProfile.is_active sense).
# Important that this stay a daily stat, so that 'realm_active_humans::day' works as expected.
CountStat('active_users_audit:is_bot:day',
sql_data_collector(UserCount, check_realmauditlog_by_user_query(
realm), (UserProfile, 'is_bot')),
CountStat.DAY),
# 'active_users_audit:is_bot:day' is the canonical record of which users were
# active on which days (in the UserProfile.is_active sense).
# Important that this stay a daily stat, so that 'realm_active_humans::day' works as expected.
CountStat('active_users_audit:is_bot:day',
sql_data_collector(UserCount, check_realmauditlog_by_user_query, (UserProfile, 'is_bot')),
CountStat.DAY),
# Sanity check on 'active_users_audit:is_bot:day', and an archetype for future LoggingCountStats.
# In RealmCount, 'active_users_audit:is_bot:day' should be the partial
# sum sequence of 'active_users_log:is_bot:day', for any realm that
# started after the latter stat was introduced.
LoggingCountStat('active_users_log:is_bot:day', RealmCount, CountStat.DAY),
# Another sanity check on 'active_users_audit:is_bot:day'. It is only an
# approximation, e.g. if a user is deactivated between the end of the
# day and when this stat is run, they won't be counted. However, it is the
# simplest of the three to inspect by hand.
CountStat('active_users:is_bot:day',
sql_data_collector(RealmCount, count_user_by_realm_query, (UserProfile, 'is_bot')),
CountStat.DAY, interval=TIMEDELTA_MAX),
# Important note: LoggingCountStat objects aren't passed the
# Realm argument, because by nature they have a logging
# structure, not a pull-from-database structure, so there's no
# way to compute them for a single realm after the fact (the
# use case for passing a Realm argument).
# User Activity stats
# Stats that measure user activity in the UserActivityInterval sense.
# Sanity check on 'active_users_audit:is_bot:day', and an archetype for future LoggingCountStats.
# In RealmCount, 'active_users_audit:is_bot:day' should be the partial
# sum sequence of 'active_users_log:is_bot:day', for any realm that
# started after the latter stat was introduced.
LoggingCountStat('active_users_log:is_bot:day',
RealmCount, CountStat.DAY),
# Another sanity check on 'active_users_audit:is_bot:day'. It is only an
# approximation, e.g. if a user is deactivated between the end of the
# day and when this stat is run, they won't be counted. However, it is the
# simplest of the three to inspect by hand.
CountStat('active_users:is_bot:day',
sql_data_collector(RealmCount, count_user_by_realm_query(realm), (UserProfile, 'is_bot')),
CountStat.DAY, interval=TIMEDELTA_MAX),
CountStat('1day_actives::day',
sql_data_collector(UserCount, check_useractivityinterval_by_user_query, None),
CountStat.DAY, interval=timedelta(days=1)-UserActivityInterval.MIN_INTERVAL_LENGTH),
CountStat('15day_actives::day',
sql_data_collector(UserCount, check_useractivityinterval_by_user_query, None),
CountStat.DAY, interval=timedelta(days=15)-UserActivityInterval.MIN_INTERVAL_LENGTH),
CountStat('minutes_active::day', DataCollector(UserCount, do_pull_minutes_active), CountStat.DAY),
# Messages read stats. messages_read::hour is the total
# number of messages read, whereas
# messages_read_interactions::hour tries to count the total
# number of UI interactions resulting in messages being marked
# as read (imperfect because of batching of some request
# types, but less likely to be overwhelmed by a single bulk
# operation).
LoggingCountStat('messages_read::hour', UserCount, CountStat.HOUR),
LoggingCountStat('messages_read_interactions::hour', UserCount, CountStat.HOUR),
# Rate limiting stats
# User Activity stats
# Stats that measure user activity in the UserActivityInterval sense.
# Used to limit the number of invitation emails sent by a realm
LoggingCountStat('invites_sent::day', RealmCount, CountStat.DAY),
CountStat('1day_actives::day',
sql_data_collector(
UserCount, check_useractivityinterval_by_user_query(realm), None),
CountStat.DAY, interval=timedelta(days=1)-UserActivityInterval.MIN_INTERVAL_LENGTH),
CountStat('15day_actives::day',
sql_data_collector(
UserCount, check_useractivityinterval_by_user_query(realm), None),
CountStat.DAY, interval=timedelta(days=15)-UserActivityInterval.MIN_INTERVAL_LENGTH),
CountStat('minutes_active::day', DataCollector(
UserCount, do_pull_minutes_active), CountStat.DAY),
# Dependent stats
# Must come after their dependencies.
# Rate limiting stats
# Canonical account of the number of active humans in a realm on each day.
DependentCountStat('realm_active_humans::day',
sql_data_collector(RealmCount, count_realm_active_humans_query, None),
CountStat.DAY,
dependencies=['active_users_audit:is_bot:day', '15day_actives::day'])
]
# Used to limit the number of invitation emails sent by a realm
LoggingCountStat('invites_sent::day', RealmCount, CountStat.DAY),
# Dependent stats
# Must come after their dependencies.
# Canonical account of the number of active humans in a realm on each day.
DependentCountStat('realm_active_humans::day',
sql_data_collector(
RealmCount, count_realm_active_humans_query(realm), None),
CountStat.DAY,
dependencies=['active_users_audit:is_bot:day', '15day_actives::day']),
]
return OrderedDict([(stat.property, stat) for stat in count_stats_])
# To avoid refactoring for now, COUNT_STATS can be used as before
COUNT_STATS = get_count_stats()
COUNT_STATS = OrderedDict([(stat.property, stat) for stat in count_stats_])
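One side of this file's diff builds the same queries by composing psycopg2.sql objects instead of plain strings, with each helper becoming a function of an optional Realm that returns a QueryFn. A minimal sketch of that composition pattern; the table and column choices are illustrative, not copied from the diff:
from typing import Optional
from psycopg2.sql import SQL, Composable, Literal

def count_users_query_sketch(realm_id: Optional[int]) -> Composable:
    # Splice an optional, safely-escaped realm filter into the template;
    # %(time_end)s is left for the driver to bind at execute() time.
    if realm_id is None:
        realm_clause = SQL("")
    else:
        realm_clause = SQL("zerver_realm.id = {} AND").format(Literal(realm_id))
    return SQL("""
        SELECT zerver_realm.id, count(*)
        FROM zerver_userprofile
        JOIN zerver_realm ON zerver_realm.id = zerver_userprofile.realm_id
        WHERE
            {realm_clause}
            zerver_userprofile.date_joined < %(time_end)s
        GROUP BY zerver_realm.id
    """).format(realm_clause=realm_clause)

# cursor.execute(count_users_query_sketch(realm_id=42), {"time_end": end_time})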

View File

@@ -4,7 +4,6 @@ from typing import List
from analytics.lib.counts import CountStat
def generate_time_series_data(days: int=100, business_hours_base: float=10,
non_business_hours_base: float=10, growth: float=1,
autocorrelation: float=0, spikiness: float=1,
@@ -44,10 +43,10 @@ def generate_time_series_data(days: int=100, business_hours_base: float=10,
[24*non_business_hours_base] * 2
holidays = [random() < holiday_rate for i in range(days)]
else:
raise AssertionError(f"Unknown frequency: {frequency}")
raise AssertionError("Unknown frequency: %s" % (frequency,))
if length < 2:
raise AssertionError("Must be generating at least 2 data points. "
f"Currently generating {length}")
"Currently generating %s" % (length,))
growth_base = growth ** (1. / (length-1))
values_no_noise = [seasonality[i % len(seasonality)] * (growth_base**i) for i in range(length)]

View File

@@ -4,7 +4,6 @@ from typing import List, Optional
from analytics.lib.counts import CountStat
from zerver.lib.timestamp import floor_to_day, floor_to_hour, verify_UTC
# If min_length is None, returns end_times from ceiling(start) to floor(end), inclusive.
# If min_length is greater than 0, pads the list to the left.
# So informally, time_range(Sep 20, Sep 22, day, None) returns [Sep 20, Sep 21, Sep 22],
@@ -20,7 +19,7 @@ def time_range(start: datetime, end: datetime, frequency: str,
end = floor_to_day(end)
step = timedelta(days=1)
else:
raise AssertionError(f"Unknown frequency: {frequency}")
raise AssertionError("Unknown frequency: %s" % (frequency,))
times = []
if min_length is not None:

View File

@@ -8,7 +8,6 @@ from django.core.management.base import BaseCommand, CommandParser
from zerver.lib.timestamp import timestamp_to_datetime
from zerver.models import Message, Recipient
def compute_stats(log_level: int) -> None:
logger = logging.getLogger()
logger.setLevel(log_level)
@@ -16,7 +15,7 @@ def compute_stats(log_level: int) -> None:
one_week_ago = timestamp_to_datetime(time.time()) - datetime.timedelta(weeks=1)
mit_query = Message.objects.filter(sender__realm__string_id="zephyr",
recipient__type=Recipient.STREAM,
date_sent__gt=one_week_ago)
pub_date__gt=one_week_ago)
for bot_sender_start in ["imap.", "rcmd.", "sys."]:
mit_query = mit_query.exclude(sender__email__startswith=(bot_sender_start))
# Filtering for "/" covers tabbott/extra@ and all the daemon/foo bots.
@@ -27,15 +26,15 @@ def compute_stats(log_level: int) -> None:
"bitcoin@mit.edu", "lp@mit.edu", "clocks@mit.edu",
"root@mit.edu", "nagios@mit.edu",
"www-data|local-realm@mit.edu"])
user_counts: Dict[str, Dict[str, int]] = {}
user_counts = {} # type: Dict[str, Dict[str, int]]
for m in mit_query.select_related("sending_client", "sender"):
email = m.sender.email
user_counts.setdefault(email, {})
user_counts[email].setdefault(m.sending_client.name, 0)
user_counts[email][m.sending_client.name] += 1
total_counts: Dict[str, int] = {}
total_user_counts: Dict[str, int] = {}
total_counts = {} # type: Dict[str, int]
total_user_counts = {} # type: Dict[str, int]
for email, counts in user_counts.items():
total_user_counts.setdefault(email, 0)
for client_name, count in counts.items():
@@ -43,8 +42,8 @@ def compute_stats(log_level: int) -> None:
total_counts[client_name] += count
total_user_counts[email] += count
logging.debug("%40s | %10s | %s", "User", "Messages", "Percentage Zulip")
top_percents: Dict[int, float] = {}
logging.debug("%40s | %10s | %s" % ("User", "Messages", "Percentage Zulip"))
top_percents = {} # type: Dict[int, float]
for size in [10, 25, 50, 100, 200, len(total_user_counts.keys())]:
top_percents[size] = 0.0
for i, email in enumerate(sorted(total_user_counts.keys(),
@@ -56,18 +55,18 @@ def compute_stats(log_level: int) -> None:
if i < size:
top_percents[size] += (percent_zulip * 1.0 / size)
logging.debug("%40s | %10s | %s%%", email, total_user_counts[email],
percent_zulip)
logging.debug("%40s | %10s | %s%%" % (email, total_user_counts[email],
percent_zulip))
logging.info("")
for size in sorted(top_percents.keys()):
logging.info("Top %6s | %s%%", size, round(top_percents[size], 1))
logging.info("Top %6s | %s%%" % (size, round(top_percents[size], 1)))
grand_total = sum(total_counts.values())
print(grand_total)
logging.info("%15s | %s", "Client", "Percentage")
logging.info("%15s | %s" % ("Client", "Percentage"))
for client in total_counts.keys():
logging.info("%15s | %s%%", client, round(100. * total_counts[client] / grand_total, 1))
logging.info("%15s | %s%%" % (client, round(100. * total_counts[client] / grand_total, 1)))
class Command(BaseCommand):
help = "Compute statistics on MIT Zephyr usage."

View File

@@ -2,13 +2,13 @@ import datetime
from typing import Any, Dict
from django.core.management.base import BaseCommand, CommandParser
from django.utils.timezone import utc
from zerver.lib.statistics import seconds_usage_between
from zerver.models import UserProfile
def analyze_activity(options: Dict[str, Any]) -> None:
day_start = datetime.datetime.strptime(options["date"], "%Y-%m-%d").replace(tzinfo=datetime.timezone.utc)
day_start = datetime.datetime.strptime(options["date"], "%Y-%m-%d").replace(tzinfo=utc)
day_end = day_start + datetime.timedelta(days=options["duration"])
user_profile_query = UserProfile.objects.all()
@@ -24,11 +24,11 @@ def analyze_activity(options: Dict[str, Any]) -> None:
continue
total_duration += duration
print(f"{user_profile.email:<37}{duration}")
print("%-*s%s" % (37, user_profile.email, duration,))
print(f"\nTotal Duration: {total_duration}")
print(f"\nTotal Duration in minutes: {total_duration.total_seconds() / 60.}")
print(f"Total Duration amortized to a month: {total_duration.total_seconds() * 30. / 60.}")
print("\nTotal Duration: %s" % (total_duration,))
print("\nTotal Duration in minutes: %s" % (total_duration.total_seconds() / 60.,))
print("Total Duration amortized to a month: %s" % (total_duration.total_seconds() * 30. / 60.,))
class Command(BaseCommand):
help = """Report analytics of user activity on a per-user and realm basis.

View File

@@ -1,21 +1,26 @@
import os
import time
from argparse import ArgumentParser
from datetime import timedelta
from typing import Any, Dict
from django.core.management.base import BaseCommand
from django.utils.timezone import now as timezone_now
from analytics.models import InstallationCount, installation_epoch, \
last_successful_fill
from analytics.lib.counts import COUNT_STATS, CountStat
from analytics.models import installation_epoch, last_successful_fill
from zerver.lib.timestamp import TimezoneNotUTCException, floor_to_day, floor_to_hour, verify_UTC
from zerver.lib.timestamp import floor_to_hour, floor_to_day, verify_UTC, \
TimezoneNotUTCException
from zerver.models import Realm
import os
import sys
import time
from typing import Any, Dict
states = {
0: "OK",
1: "WARNING",
2: "CRITICAL",
3: "UNKNOWN",
3: "UNKNOWN"
}
class Command(BaseCommand):
@@ -32,7 +37,8 @@ class Command(BaseCommand):
state_file_tmp = state_file_path + "-tmp"
with open(state_file_tmp, "w") as f:
f.write(f"{int(time.time())}|{status}|{states[status]}|{message}\n")
f.write("%s|%s|%s|%s\n" % (
int(time.time()), status, states[status], message))
os.rename(state_file_tmp, state_file_path)
def get_fill_state(self) -> Dict[str, Any]:
@@ -48,7 +54,7 @@ class Command(BaseCommand):
try:
verify_UTC(last_fill)
except TimezoneNotUTCException:
return {'status': 2, 'message': f'FillState not in UTC for {property}'}
return {'status': 2, 'message': 'FillState not in UTC for %s' % (property,)}
if stat.frequency == CountStat.DAY:
floor_function = floor_to_day
@@ -60,7 +66,8 @@ class Command(BaseCommand):
critical_threshold = timedelta(minutes=150)
if floor_function(last_fill) != last_fill:
return {'status': 2, 'message': f'FillState not on {stat.frequency} boundary for {property}'}
return {'status': 2, 'message': 'FillState not on %s boundary for %s' %
(stat.frequency, property)}
time_to_last_fill = timezone_now() - last_fill
if time_to_last_fill > critical_threshold:
@@ -71,16 +78,7 @@ class Command(BaseCommand):
if len(critical_unfilled_properties) == 0 and len(warning_unfilled_properties) == 0:
return {'status': 0, 'message': 'FillState looks fine.'}
if len(critical_unfilled_properties) == 0:
return {
'status': 1,
'message': 'Missed filling {} once.'.format(
', '.join(warning_unfilled_properties),
),
}
return {
'status': 2,
'message': 'Missed filling {} once. Missed filling {} at least twice.'.format(
', '.join(warning_unfilled_properties),
', '.join(critical_unfilled_properties),
),
}
return {'status': 1, 'message': 'Missed filling %s once.' %
(', '.join(warning_unfilled_properties),)}
return {'status': 2, 'message': 'Missed filling %s once. Missed filling %s at least twice.' %
(', '.join(warning_unfilled_properties), ', '.join(critical_unfilled_properties))}
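The state-file write near the top of this command goes through a temporary file followed by os.rename(). A hedged aside on why that pattern is used: on POSIX filesystems, rename() atomically replaces the destination, so whatever reads the state file (here, presumably a Nagios-style check) never observes a half-written line. A minimal sketch of the pattern:
import os
import time

def write_state_file(path: str, status: int, message: str) -> None:
    tmp_path = path + "-tmp"
    with open(tmp_path, "w") as f:
        f.write("%s|%s|%s\n" % (int(time.time()), status, message))
    os.rename(tmp_path, path)  # atomic swap; readers see old or new, never partial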

View File

@@ -1,11 +1,11 @@
import sys
from argparse import ArgumentParser
from typing import Any
from django.core.management.base import BaseCommand, CommandError
from django.core.management.base import BaseCommand
from analytics.lib.counts import do_drop_all_analytics_tables
class Command(BaseCommand):
help = """Clear analytics tables."""
@@ -18,4 +18,5 @@ class Command(BaseCommand):
if options['force']:
do_drop_all_analytics_tables()
else:
raise CommandError("Would delete all data from analytics tables (!); use --force to do so.")
print("Would delete all data from analytics tables (!); use --force to do so.")
sys.exit(1)

View File

@@ -1,11 +1,11 @@
import sys
from argparse import ArgumentParser
from typing import Any
from django.core.management.base import BaseCommand, CommandError
from django.core.management.base import BaseCommand
from analytics.lib.counts import COUNT_STATS, do_drop_single_stat
class Command(BaseCommand):
help = """Clear analytics tables."""
@@ -20,8 +20,10 @@ class Command(BaseCommand):
def handle(self, *args: Any, **options: Any) -> None:
property = options['property']
if property not in COUNT_STATS:
raise CommandError(f"Invalid property: {property}")
print("Invalid property: %s" % (property,))
sys.exit(1)
if not options['force']:
raise CommandError("No action taken. Use --force.")
print("No action taken. Use --force.")
sys.exit(1)
do_drop_single_stat(property)

View File

@@ -1,6 +1,6 @@
import datetime
from argparse import ArgumentParser
from typing import Any, Optional
from typing import Any
from django.db.models import Count, QuerySet
from django.utils.timezone import now as timezone_now
@@ -8,7 +8,6 @@ from django.utils.timezone import now as timezone_now
from zerver.lib.management import ZulipBaseCommand
from zerver.models import UserActivity
class Command(ZulipBaseCommand):
help = """Report rough client activity globally, for a realm, or for a user
@@ -54,10 +53,10 @@ Usage examples:
counts.sort()
for count in counts:
print(f"{count[1]:>25} {count[0]:15}")
print("%25s %15d" % (count[1], count[0]))
print("Total:", total)
def handle(self, *args: Any, **options: Optional[str]) -> None:
def handle(self, *args: Any, **options: str) -> None:
realm = self.get_realm(options)
if options["user"] is None:
if options["target"] == "server" and realm is None:

View File

@@ -1,26 +1,20 @@
from datetime import timedelta
from typing import Any, Dict, List, Mapping, Optional, Type
from unittest import mock
from datetime import datetime, timedelta
from typing import Any, Dict, List, Mapping, Optional, Type, Union
from django.core.management.base import BaseCommand
from django.utils.timezone import now as timezone_now
from analytics.lib.counts import COUNT_STATS, CountStat, do_drop_all_analytics_tables
from analytics.lib.counts import COUNT_STATS, \
CountStat, do_drop_all_analytics_tables
from analytics.lib.fixtures import generate_time_series_data
from analytics.lib.time_utils import time_range
from analytics.models import (
BaseCount,
FillState,
InstallationCount,
RealmCount,
StreamCount,
UserCount,
)
from zerver.lib.actions import STREAM_ASSIGNMENT_COLORS, do_change_user_role
from zerver.lib.create_user import create_user
from analytics.models import BaseCount, FillState, RealmCount, UserCount, \
StreamCount, InstallationCount
from zerver.lib.actions import do_change_is_admin
from zerver.lib.timestamp import floor_to_day
from zerver.models import Client, Realm, Recipient, Stream, Subscription, UserProfile
from zerver.models import Realm, UserProfile, Stream, Message, Client, \
RealmAuditLog, Recipient
class Command(BaseCommand):
help = """Populates analytics tables with randomly generated data."""
@@ -28,6 +22,20 @@ class Command(BaseCommand):
DAYS_OF_DATA = 100
random_seed = 26
def create_user(self, email: str,
full_name: str,
is_staff: bool,
date_joined: datetime,
realm: Realm) -> UserProfile:
user = UserProfile.objects.create(
email=email, full_name=full_name, is_staff=is_staff,
realm=realm, short_name=full_name, pointer=-1, last_pointer_updater='none',
api_key='42', date_joined=date_joined)
RealmAuditLog.objects.create(
realm=realm, modified_user=user, event_type=RealmAuditLog.USER_CREATED,
event_time=user.date_joined)
return user
def generate_fixture_data(self, stat: CountStat, business_hours_base: float,
non_business_hours_base: float, growth: float,
autocorrelation: float, spikiness: float,
@@ -60,25 +68,11 @@ class Command(BaseCommand):
last_end_time = floor_to_day(timezone_now())
realm = Realm.objects.create(
string_id='analytics', name='Analytics', date_created=installation_time)
with mock.patch("zerver.lib.create_user.timezone_now", return_value=installation_time):
shylock = create_user('shylock@analytics.ds', 'Shylock', realm,
full_name='Shylock', short_name='shylock',
role=UserProfile.ROLE_REALM_ADMINISTRATOR)
do_change_user_role(shylock, UserProfile.ROLE_REALM_ADMINISTRATOR, acting_user=None)
shylock = self.create_user('shylock@analytics.ds', 'Shylock', True, installation_time, realm)
do_change_is_admin(shylock, True)
stream = Stream.objects.create(
name='all', realm=realm, date_created=installation_time)
recipient = Recipient.objects.create(type_id=stream.id, type=Recipient.STREAM)
stream.recipient = recipient
stream.save(update_fields=["recipient"])
# Subscribe shylock to the stream to avoid invariant failures.
# TODO: This should use subscribe_users_to_streams from populate_db.
subs = [
Subscription(recipient=recipient,
user_profile=shylock,
color=STREAM_ASSIGNMENT_COLORS[0]),
]
Subscription.objects.bulk_create(subs)
Recipient.objects.create(type_id=stream.id, type=Recipient.STREAM)
def insert_fixture_data(stat: CountStat,
fixture_data: Mapping[Optional[str], List[int]],
@@ -86,7 +80,7 @@ class Command(BaseCommand):
end_times = time_range(last_end_time, last_end_time, stat.frequency,
len(list(fixture_data.values())[0]))
if table == InstallationCount:
id_args: Dict[str, Any] = {}
id_args = {} # type: Dict[str, Any]
if table == RealmCount:
id_args = {'realm': realm}
if table == UserCount:
@@ -101,13 +95,13 @@ class Command(BaseCommand):
for end_time, value in zip(end_times, values) if value != 0])
stat = COUNT_STATS['1day_actives::day']
realm_data: Mapping[Optional[str], List[int]] = {
realm_data = {
None: self.generate_fixture_data(stat, .08, .02, 3, .3, 6, partial_sum=True),
}
} # type: Mapping[Optional[str], List[int]]
insert_fixture_data(stat, realm_data, RealmCount)
installation_data: Mapping[Optional[str], List[int]] = {
installation_data = {
None: self.generate_fixture_data(stat, .8, .2, 4, .3, 6, partial_sum=True),
}
} # type: Mapping[Optional[str], List[int]]
insert_fixture_data(stat, installation_data, InstallationCount)
FillState.objects.create(property=stat.property, end_time=last_end_time,
state=FillState.DONE)
@@ -137,9 +131,8 @@ class Command(BaseCommand):
state=FillState.DONE)
stat = COUNT_STATS['messages_sent:is_bot:hour']
user_data: Mapping[Optional[str], List[int]] = {
'false': self.generate_fixture_data(stat, 2, 1, 1.5, .6, 8, holiday_rate=.1),
}
user_data = {'false': self.generate_fixture_data(
stat, 2, 1, 1.5, .6, 8, holiday_rate=.1)} # type: Mapping[Optional[str], List[int]]
insert_fixture_data(stat, user_data, UserCount)
realm_data = {'false': self.generate_fixture_data(stat, 35, 15, 6, .6, 4),
'true': self.generate_fixture_data(stat, 15, 15, 3, .4, 2)}
@@ -215,22 +208,8 @@ class Command(BaseCommand):
realm_data = {'false': self.generate_fixture_data(stat, 30, 5, 6, .6, 4),
'true': self.generate_fixture_data(stat, 20, 2, 3, .2, 3)}
insert_fixture_data(stat, realm_data, RealmCount)
stream_data: Mapping[Optional[str], List[int]] = {
'false': self.generate_fixture_data(stat, 10, 7, 5, .6, 4),
'true': self.generate_fixture_data(stat, 5, 3, 2, .4, 2),
}
stream_data = {'false': self.generate_fixture_data(stat, 10, 7, 5, .6, 4),
'true': self.generate_fixture_data(stat, 5, 3, 2, .4, 2)} # type: Mapping[Optional[str], List[int]]
insert_fixture_data(stat, stream_data, StreamCount)
FillState.objects.create(property=stat.property, end_time=last_end_time,
state=FillState.DONE)
stat = COUNT_STATS['messages_read::hour']
user_data = {
None: self.generate_fixture_data(stat, 7, 3, 2, .6, 8, holiday_rate=.1),
}
insert_fixture_data(stat, user_data, UserCount)
realm_data = {
None: self.generate_fixture_data(stat, 50, 35, 6, .6, 4)
}
insert_fixture_data(stat, realm_data, RealmCount)
FillState.objects.create(property=stat.property, end_time=last_end_time,
state=FillState.DONE)

View File

@@ -2,21 +2,13 @@ import datetime
from argparse import ArgumentParser
from typing import Any, List
from django.core.management.base import BaseCommand, CommandError
import pytz
from django.core.management.base import BaseCommand
from django.db.models import Count
from django.utils.timezone import now as timezone_now
from zerver.models import (
Message,
Realm,
Recipient,
Stream,
Subscription,
UserActivity,
UserMessage,
UserProfile,
get_realm,
)
from zerver.models import Message, Realm, Recipient, Stream, \
Subscription, UserActivity, UserMessage, UserProfile, get_realm
MOBILE_CLIENT_LIST = ["Android", "ios"]
HUMAN_CLIENT_LIST = MOBILE_CLIENT_LIST + ["website"]
@@ -42,32 +34,32 @@ class Command(BaseCommand):
def messages_sent_by(self, user: UserProfile, days_ago: int) -> int:
sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
return human_messages.filter(sender=user, date_sent__gt=sent_time_cutoff).count()
return human_messages.filter(sender=user, pub_date__gt=sent_time_cutoff).count()
def total_messages(self, realm: Realm, days_ago: int) -> int:
sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
return Message.objects.filter(sender__realm=realm, date_sent__gt=sent_time_cutoff).count()
return Message.objects.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).count()
def human_messages(self, realm: Realm, days_ago: int) -> int:
sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
return human_messages.filter(sender__realm=realm, date_sent__gt=sent_time_cutoff).count()
return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).count()
def api_messages(self, realm: Realm, days_ago: int) -> int:
return (self.total_messages(realm, days_ago) - self.human_messages(realm, days_ago))
def stream_messages(self, realm: Realm, days_ago: int) -> int:
sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
return human_messages.filter(sender__realm=realm, date_sent__gt=sent_time_cutoff,
return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff,
recipient__type=Recipient.STREAM).count()
def private_messages(self, realm: Realm, days_ago: int) -> int:
sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
return human_messages.filter(sender__realm=realm, date_sent__gt=sent_time_cutoff).exclude(
return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).exclude(
recipient__type=Recipient.STREAM).exclude(recipient__type=Recipient.HUDDLE).count()
def group_private_messages(self, realm: Realm, days_ago: int) -> int:
sent_time_cutoff = timezone_now() - datetime.timedelta(days=days_ago)
return human_messages.filter(sender__realm=realm, date_sent__gt=sent_time_cutoff).exclude(
return human_messages.filter(sender__realm=realm, pub_date__gt=sent_time_cutoff).exclude(
recipient__type=Recipient.STREAM).exclude(recipient__type=Recipient.PERSONAL).count()
def report_percentage(self, numerator: float, denominator: float, text: str) -> None:
@@ -75,14 +67,15 @@ class Command(BaseCommand):
fraction = 0.0
else:
fraction = numerator / float(denominator)
print(f"{fraction * 100:.2f}% of", text)
print("%.2f%% of" % (fraction * 100,), text)
def handle(self, *args: Any, **options: Any) -> None:
if options['realms']:
try:
realms = [get_realm(string_id) for string_id in options['realms']]
except Realm.DoesNotExist as e:
raise CommandError(e)
print(e)
exit(1)
else:
realms = Realm.objects.all()
@@ -93,26 +86,26 @@ class Command(BaseCommand):
active_users = self.active_users(realm)
num_active = len(active_users)
print(f"{num_active} active users ({len(user_profiles)} total)")
print("%d active users (%d total)" % (num_active, len(user_profiles)))
streams = Stream.objects.filter(realm=realm).extra(
tables=['zerver_subscription', 'zerver_recipient'],
where=['zerver_subscription.recipient_id = zerver_recipient.id',
'zerver_recipient.type = 2',
'zerver_recipient.type_id = zerver_stream.id',
'zerver_subscription.active = true']).annotate(count=Count("name"))
print(f"{streams.count()} streams")
print("%d streams" % (streams.count(),))
for days_ago in (1, 7, 30):
print(f"In last {days_ago} days, users sent:")
print("In last %d days, users sent:" % (days_ago,))
sender_quantities = [self.messages_sent_by(user, days_ago) for user in user_profiles]
for quantity in sorted(sender_quantities, reverse=True):
print(quantity, end=' ')
print("")
print(f"{self.stream_messages(realm, days_ago)} stream messages")
print(f"{self.private_messages(realm, days_ago)} one-on-one private messages")
print(f"{self.api_messages(realm, days_ago)} messages sent via the API")
print(f"{self.group_private_messages(realm, days_ago)} group private messages")
print("%d stream messages" % (self.stream_messages(realm, days_ago),))
print("%d one-on-one private messages" % (self.private_messages(realm, days_ago),))
print("%d messages sent via the API" % (self.api_messages(realm, days_ago),))
print("%d group private messages" % (self.group_private_messages(realm, days_ago),))
num_notifications_enabled = len([x for x in active_users if x.enable_desktop_notifications])
self.report_percentage(num_notifications_enabled, num_active,
@@ -132,29 +125,29 @@ class Command(BaseCommand):
starrers = UserMessage.objects.filter(user_profile__in=user_profiles,
flags=UserMessage.flags.starred).values(
"user_profile").annotate(count=Count("user_profile"))
print("{} users have starred {} messages".format(
print("%d users have starred %d messages" % (
len(starrers), sum([elt["count"] for elt in starrers])))
active_user_subs = Subscription.objects.filter(
user_profile__in=user_profiles, active=True)
# Streams not in home view
non_home_view = active_user_subs.filter(is_muted=True).values(
non_home_view = active_user_subs.filter(in_home_view=False).values(
"user_profile").annotate(count=Count("user_profile"))
print("{} users have {} streams not in home view".format(
print("%d users have %d streams not in home view" % (
len(non_home_view), sum([elt["count"] for elt in non_home_view])))
# Code block markup
markup_messages = human_messages.filter(
sender__realm=realm, content__contains="~~~").values(
"sender").annotate(count=Count("sender"))
print("{} users have used code block markup on {} messages".format(
print("%d users have used code block markup on %s messages" % (
len(markup_messages), sum([elt["count"] for elt in markup_messages])))
# Notifications for stream messages
notifications = active_user_subs.filter(desktop_notifications=True).values(
"user_profile").annotate(count=Count("user_profile"))
print("{} users receive desktop notifications for {} streams".format(
print("%d users receive desktop notifications for %d streams" % (
len(notifications), sum([elt["count"] for elt in notifications])))
print("")

View File

@@ -1,11 +1,11 @@
from argparse import ArgumentParser
from typing import Any
from django.core.management.base import BaseCommand, CommandError
from django.core.management.base import BaseCommand
from django.db.models import Q
from zerver.models import Message, Realm, Recipient, Stream, Subscription, get_realm
from zerver.models import Message, Realm, \
Recipient, Stream, Subscription, get_realm
class Command(BaseCommand):
help = "Generate statistics on the streams for a realm."
@@ -19,38 +19,26 @@ class Command(BaseCommand):
try:
realms = [get_realm(string_id) for string_id in options['realms']]
except Realm.DoesNotExist as e:
raise CommandError(e)
print(e)
exit(1)
else:
realms = Realm.objects.all()
for realm in realms:
print(realm.string_id)
print("------------")
print("%25s %15s %10s" % ("stream", "subscribers", "messages"))
streams = Stream.objects.filter(realm=realm).exclude(Q(name__istartswith="tutorial-"))
# private stream count
private_count = 0
# public stream count
public_count = 0
invite_only_count = 0
for stream in streams:
if stream.invite_only:
private_count += 1
else:
public_count += 1
print("------------")
print(realm.string_id, end=' ')
print("{:>10} {} public streams and".format("(", public_count), end=' ')
print(f"{private_count} private streams )")
print("------------")
print("{:>25} {:>15} {:>10} {:>12}".format("stream", "subscribers", "messages", "type"))
for stream in streams:
if stream.invite_only:
stream_type = 'private'
else:
stream_type = 'public'
print(f"{stream.name:>25}", end=' ')
invite_only_count += 1
continue
print("%25s" % (stream.name,), end=' ')
recipient = Recipient.objects.filter(type=Recipient.STREAM, type_id=stream.id)
print("{:10}".format(len(Subscription.objects.filter(recipient=recipient,
active=True))), end=' ')
print("%10d" % (len(Subscription.objects.filter(recipient=recipient,
active=True)),), end=' ')
num_messages = len(Message.objects.filter(recipient=recipient))
print(f"{num_messages:12}", end=' ')
print(f"{stream_type:>15}")
print("%12d" % (num_messages,))
print("%d private streams" % (invite_only_count,))
print("")

View File

@@ -1,21 +1,19 @@
import os
import time
from argparse import ArgumentParser
from datetime import timezone
from typing import Any, Dict
from django.conf import settings
from django.core.management.base import BaseCommand
from django.utils.dateparse import parse_datetime
from django.utils.timezone import now as timezone_now
from django.utils.timezone import utc as timezone_utc
from analytics.lib.counts import COUNT_STATS, logger, process_count_stat
from scripts.lib.zulip_tools import ENDC, WARNING
from zerver.lib.remote_server import send_analytics_to_remote_server
from zerver.lib.timestamp import floor_to_hour
from zerver.models import Realm
class Command(BaseCommand):
help = """Fills Analytics tables.
@@ -60,18 +58,18 @@ class Command(BaseCommand):
fill_to_time = parse_datetime(options['time'])
if options['utc']:
fill_to_time = fill_to_time.replace(tzinfo=timezone.utc)
fill_to_time = fill_to_time.replace(tzinfo=timezone_utc)
if fill_to_time.tzinfo is None:
raise ValueError("--time must be timezone aware. Maybe you meant to use the --utc option?")
fill_to_time = floor_to_hour(fill_to_time.astimezone(timezone.utc))
fill_to_time = floor_to_hour(fill_to_time.astimezone(timezone_utc))
if options['stat'] is not None:
stats = [COUNT_STATS[options['stat']]]
else:
stats = list(COUNT_STATS.values())
logger.info("Starting updating analytics counts through %s", fill_to_time)
logger.info("Starting updating analytics counts through %s" % (fill_to_time,))
if options['verbose']:
start = time.time()
last = start
@@ -79,12 +77,10 @@ class Command(BaseCommand):
for stat in stats:
process_count_stat(stat, fill_to_time)
if options['verbose']:
print(f"Updated {stat.property} in {time.time() - last:.3f}s")
print("Updated %s in %.3fs" % (stat.property, time.time() - last))
last = time.time()
if options['verbose']:
print(f"Finished updating analytics counts through {fill_to_time} in {time.time() - start:.3f}s")
logger.info("Finished updating analytics counts through %s", fill_to_time)
if settings.PUSH_NOTIFICATION_BOUNCER_URL and settings.SUBMIT_USAGE_STATISTICS:
send_analytics_to_remote_server()
print("Finished updating analytics counts through %s in %.3fs" %
(fill_to_time, time.time() - start))
logger.info("Finished updating analytics counts through %s" % (fill_to_time,))

View File

@@ -2,12 +2,11 @@ import datetime
from argparse import ArgumentParser
from typing import Any
from django.core.management.base import BaseCommand, CommandError
from django.core.management.base import BaseCommand
from django.utils.timezone import now as timezone_now
from zerver.models import Message, Realm, Stream, UserProfile, get_realm
class Command(BaseCommand):
help = "Generate statistics on user activity."
@@ -18,25 +17,26 @@ class Command(BaseCommand):
def messages_sent_by(self, user: UserProfile, week: int) -> int:
start = timezone_now() - datetime.timedelta(days=(week + 1)*7)
end = timezone_now() - datetime.timedelta(days=week*7)
return Message.objects.filter(sender=user, date_sent__gt=start, date_sent__lte=end).count()
return Message.objects.filter(sender=user, pub_date__gt=start, pub_date__lte=end).count()
def handle(self, *args: Any, **options: Any) -> None:
if options['realms']:
try:
realms = [get_realm(string_id) for string_id in options['realms']]
except Realm.DoesNotExist as e:
raise CommandError(e)
print(e)
exit(1)
else:
realms = Realm.objects.all()
for realm in realms:
print(realm.string_id)
user_profiles = UserProfile.objects.filter(realm=realm, is_active=True)
print(f"{len(user_profiles)} users")
print(f"{len(Stream.objects.filter(realm=realm))} streams")
print("%d users" % (len(user_profiles),))
print("%d streams" % (len(Stream.objects.filter(realm=realm)),))
for user_profile in user_profiles:
print(f"{user_profile.email:>35}", end=' ')
print("%35s" % (user_profile.email,), end=' ')
for week in range(10):
print(f"{self.messages_sent_by(user_profile, week):5}", end=' ')
print("%5d" % (self.messages_sent_by(user_profile, week)), end=' ')
print("")

View File

@@ -1,7 +1,9 @@
# -*- coding: utf-8 -*-
import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models
import zerver.lib.str_utils
class Migration(migrations.Migration):
@@ -89,22 +91,22 @@ class Migration(migrations.Migration):
),
migrations.AlterUniqueTogether(
name='usercount',
unique_together={('user', 'property', 'end_time', 'interval')},
unique_together=set([('user', 'property', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='streamcount',
unique_together={('stream', 'property', 'end_time', 'interval')},
unique_together=set([('stream', 'property', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='realmcount',
unique_together={('realm', 'property', 'end_time', 'interval')},
unique_together=set([('realm', 'property', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='installationcount',
unique_together={('property', 'end_time', 'interval')},
unique_together=set([('property', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='huddlecount',
unique_together={('huddle', 'property', 'end_time', 'interval')},
unique_together=set([('huddle', 'property', 'end_time', 'interval')]),
),
]

View File

@@ -1,5 +1,5 @@
from django.db import migrations
# -*- coding: utf-8 -*-
from django.db import migrations, models
class Migration(migrations.Migration):
@@ -10,7 +10,7 @@ class Migration(migrations.Migration):
operations = [
migrations.AlterUniqueTogether(
name='huddlecount',
unique_together=set(),
unique_together=set([]),
),
migrations.RemoveField(
model_name='huddlecount',

View File

@@ -1,5 +1,7 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models
import zerver.lib.str_utils
class Migration(migrations.Migration):

View File

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [

View File

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [

View File

@@ -1,5 +1,5 @@
from django.db import migrations
# -*- coding: utf-8 -*-
from django.db import migrations, models
class Migration(migrations.Migration):
@@ -10,18 +10,18 @@ class Migration(migrations.Migration):
operations = [
migrations.AlterUniqueTogether(
name='installationcount',
unique_together={('property', 'subgroup', 'end_time', 'interval')},
unique_together=set([('property', 'subgroup', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='realmcount',
unique_together={('realm', 'property', 'subgroup', 'end_time', 'interval')},
unique_together=set([('realm', 'property', 'subgroup', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='streamcount',
unique_together={('stream', 'property', 'subgroup', 'end_time', 'interval')},
unique_together=set([('stream', 'property', 'subgroup', 'end_time', 'interval')]),
),
migrations.AlterUniqueTogether(
name='usercount',
unique_together={('user', 'property', 'subgroup', 'end_time', 'interval')},
unique_together=set([('user', 'property', 'subgroup', 'end_time', 'interval')]),
),
]

View File

@@ -1,7 +1,8 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.10.4 on 2017-01-16 20:50
from django.conf import settings
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
@@ -11,7 +12,7 @@ class Migration(migrations.Migration):
operations = [
migrations.AlterUniqueTogether(
name='installationcount',
unique_together={('property', 'subgroup', 'end_time')},
unique_together=set([('property', 'subgroup', 'end_time')]),
),
migrations.RemoveField(
model_name='installationcount',
@@ -19,7 +20,7 @@ class Migration(migrations.Migration):
),
migrations.AlterUniqueTogether(
name='realmcount',
unique_together={('realm', 'property', 'subgroup', 'end_time')},
unique_together=set([('realm', 'property', 'subgroup', 'end_time')]),
),
migrations.RemoveField(
model_name='realmcount',
@@ -27,7 +28,7 @@ class Migration(migrations.Migration):
),
migrations.AlterUniqueTogether(
name='streamcount',
unique_together={('stream', 'property', 'subgroup', 'end_time')},
unique_together=set([('stream', 'property', 'subgroup', 'end_time')]),
),
migrations.RemoveField(
model_name='streamcount',
@@ -35,7 +36,7 @@ class Migration(migrations.Migration):
),
migrations.AlterUniqueTogether(
name='usercount',
unique_together={('user', 'property', 'subgroup', 'end_time')},
unique_together=set([('user', 'property', 'subgroup', 'end_time')]),
),
migrations.RemoveField(
model_name='usercount',

View File

@@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.10.5 on 2017-02-01 22:28
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
@@ -12,14 +12,14 @@ class Migration(migrations.Migration):
operations = [
migrations.AlterIndexTogether(
name='realmcount',
index_together={('property', 'end_time')},
index_together=set([('property', 'end_time')]),
),
migrations.AlterIndexTogether(
name='streamcount',
index_together={('property', 'realm', 'end_time')},
index_together=set([('property', 'realm', 'end_time')]),
),
migrations.AlterIndexTogether(
name='usercount',
index_together={('property', 'realm', 'end_time')},
index_together=set([('property', 'realm', 'end_time')]),
),
]

View File

@@ -1,8 +1,8 @@
# -*- coding: utf-8 -*-
from django.db import migrations
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
def delete_messages_sent_to_stream_stat(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
UserCount = apps.get_model('analytics', 'UserCount')
StreamCount = apps.get_model('analytics', 'StreamCount')

View File

@@ -1,8 +1,8 @@
# -*- coding: utf-8 -*-
from django.db import migrations
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
def clear_message_sent_by_message_type_values(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
UserCount = apps.get_model('analytics', 'UserCount')
StreamCount = apps.get_model('analytics', 'StreamCount')

View File

@@ -1,8 +1,8 @@
# -*- coding: utf-8 -*-
from django.db import migrations
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.backends.postgresql_psycopg2.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
def clear_analytics_tables(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
UserCount = apps.get_model('analytics', 'UserCount')
StreamCount = apps.get_model('analytics', 'StreamCount')

View File

@@ -1,7 +1,9 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2018-01-29 08:14
from __future__ import unicode_literals
import django.db.models.deletion
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):

View File

@@ -1,32 +0,0 @@
# Generated by Django 1.11.18 on 2019-02-02 02:47
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('analytics', '0012_add_on_delete'),
]
operations = [
migrations.RemoveField(
model_name='installationcount',
name='anomaly',
),
migrations.RemoveField(
model_name='realmcount',
name='anomaly',
),
migrations.RemoveField(
model_name='streamcount',
name='anomaly',
),
migrations.RemoveField(
model_name='usercount',
name='anomaly',
),
migrations.DeleteModel(
name='Anomaly',
),
]

View File

@@ -1,17 +0,0 @@
# Generated by Django 1.11.26 on 2020-01-27 04:32
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('analytics', '0013_remove_anomaly'),
]
operations = [
migrations.RemoveField(
model_name='fillstate',
name='last_modified',
),
]

View File

@@ -1,53 +0,0 @@
from django.db import migrations
from django.db.backends.postgresql.schema import DatabaseSchemaEditor
from django.db.migrations.state import StateApps
from django.db.models import Count, Sum
def clear_duplicate_counts(apps: StateApps, schema_editor: DatabaseSchemaEditor) -> None:
"""This is a preparatory migration for our Analytics tables.
The backstory is that Django's unique_together indexes do not properly
handle the subgroup=None corner case (allowing duplicate rows that have a
subgroup of None), which meant that in race conditions, rather than updating
an existing row for the property/realm/time with subgroup=None, Django would
create a duplicate row.
In the next migration, we'll add a proper constraint to fix this bug, but
we need to fix any existing problematic rows before we can add that constraint.
We fix this in an appropriate fashion for each type of CountStat object; mainly
this means deleting the extra rows, but for LoggingCountStat objects, we need to
additionally combine the sums.
"""
RealmCount = apps.get_model('analytics', 'RealmCount')
realm_counts = RealmCount.objects.filter(subgroup=None).values(
'realm_id', 'property', 'end_time').annotate(
Count('id'), Sum('value')).filter(id__count__gt=1)
for realm_count in realm_counts:
realm_count.pop('id__count')
total_value = realm_count.pop('value__sum')
duplicate_counts = list(RealmCount.objects.filter(**realm_count))
first_count = duplicate_counts[0]
if realm_count['property'] in ["invites_sent::day", "active_users_log:is_bot:day"]:
# For LoggingCountStat objects, the right fix is to combine the totals;
# for other CountStat objects, we expect the duplicates to have the same value.
# And so all we need to do is delete them.
first_count.value = total_value
first_count.save()
to_cleanup = duplicate_counts[1:]
for duplicate_count in to_cleanup:
duplicate_count.delete()
class Migration(migrations.Migration):
dependencies = [
('analytics', '0014_remove_fillstate_last_modified'),
]
operations = [
migrations.RunPython(clear_duplicate_counts,
reverse_code=migrations.RunPython.noop),
]
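The docstring above explains the underlying problem: a plain unique_together index never rejects duplicate rows whose subgroup is NULL, because Postgres treats NULLs as distinct in unique indexes. The constraints added in the following migration (and visible in the models.py diff further down) address this with a pair of partial unique constraints; a hedged, self-contained sketch of that shape, using a made-up model name:
from django.db import models
from django.db.models import Q, UniqueConstraint

class ExampleCount(models.Model):
    property = models.CharField(max_length=32)
    subgroup = models.CharField(max_length=16, null=True)
    end_time = models.DateTimeField()
    value = models.BigIntegerField()

    class Meta:
        app_label = "analytics"  # assumption, only so the sketch is importable
        constraints = [
            # Enforced only for rows that actually have a subgroup ...
            UniqueConstraint(fields=["property", "subgroup", "end_time"],
                             condition=Q(subgroup__isnull=False),
                             name="example_unique_with_subgroup"),
            # ... plus a second partial constraint covering subgroup IS NULL.
            UniqueConstraint(fields=["property", "end_time"],
                             condition=Q(subgroup__isnull=True),
                             name="example_unique_null_subgroup"),
        ]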

View File

@@ -1,61 +0,0 @@
# Generated by Django 2.2.10 on 2020-02-29 19:40
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('analytics', '0015_clear_duplicate_counts'),
]
operations = [
migrations.AlterUniqueTogether(
name='installationcount',
unique_together=set(),
),
migrations.AlterUniqueTogether(
name='realmcount',
unique_together=set(),
),
migrations.AlterUniqueTogether(
name='streamcount',
unique_together=set(),
),
migrations.AlterUniqueTogether(
name='usercount',
unique_together=set(),
),
migrations.AddConstraint(
model_name='installationcount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=False), fields=('property', 'subgroup', 'end_time'), name='unique_installation_count'),
),
migrations.AddConstraint(
model_name='installationcount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=True), fields=('property', 'end_time'), name='unique_installation_count_null_subgroup'),
),
migrations.AddConstraint(
model_name='realmcount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=False), fields=('realm', 'property', 'subgroup', 'end_time'), name='unique_realm_count'),
),
migrations.AddConstraint(
model_name='realmcount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=True), fields=('realm', 'property', 'end_time'), name='unique_realm_count_null_subgroup'),
),
migrations.AddConstraint(
model_name='streamcount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=False), fields=('stream', 'property', 'subgroup', 'end_time'), name='unique_stream_count'),
),
migrations.AddConstraint(
model_name='streamcount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=True), fields=('stream', 'property', 'end_time'), name='unique_stream_count_null_subgroup'),
),
migrations.AddConstraint(
model_name='usercount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=False), fields=('user', 'property', 'subgroup', 'end_time'), name='unique_user_count'),
),
migrations.AddConstraint(
model_name='usercount',
constraint=models.UniqueConstraint(condition=models.Q(subgroup__isnull=True), fields=('user', 'property', 'end_time'), name='unique_user_count_null_subgroup'),
),
]

View File

@@ -1,24 +1,24 @@
import datetime
from typing import Optional
from typing import Any, Dict, Optional, Tuple, Union
from django.db import models
from django.db.models import Q, UniqueConstraint
from zerver.lib.timestamp import floor_to_day
from zerver.models import Realm, Stream, UserProfile
from zerver.models import Realm, Recipient, Stream, UserProfile
class FillState(models.Model):
property: str = models.CharField(max_length=40, unique=True)
end_time: datetime.datetime = models.DateTimeField()
property = models.CharField(max_length=40, unique=True) # type: str
end_time = models.DateTimeField() # type: datetime.datetime
# Valid states are {DONE, STARTED}
DONE = 1
STARTED = 2
state: int = models.PositiveSmallIntegerField()
state = models.PositiveSmallIntegerField() # type: int
last_modified = models.DateTimeField(auto_now=True) # type: datetime.datetime
def __str__(self) -> str:
return f"<FillState: {self.property} {self.end_time} {self.state}>"
return "<FillState: %s %s %s>" % (self.property, self.end_time, self.state)
# The earliest/starting end_time in FillState
# We assume there is at least one realm
@@ -34,14 +34,22 @@ def last_successful_fill(property: str) -> Optional[datetime.datetime]:
return fillstate.end_time
return fillstate.end_time - datetime.timedelta(hours=1)
# would only ever make entries here by hand
class Anomaly(models.Model):
info = models.CharField(max_length=1000) # type: str
def __str__(self) -> str:
return "<Anomaly: %s... %s>" % (self.info, self.id)
class BaseCount(models.Model):
# Note: When inheriting from BaseCount, you may want to rearrange
# the order of the columns in the migration to make sure they
# match how you'd like the table to be arranged.
property: str = models.CharField(max_length=32)
subgroup: Optional[str] = models.CharField(max_length=16, null=True)
end_time: datetime.datetime = models.DateTimeField()
value: int = models.BigIntegerField()
property = models.CharField(max_length=32) # type: str
subgroup = models.CharField(max_length=16, null=True) # type: Optional[str]
end_time = models.DateTimeField() # type: datetime.datetime
value = models.BigIntegerField() # type: int
anomaly = models.ForeignKey(Anomaly, on_delete=models.SET_NULL, null=True) # type: Optional[Anomaly]
class Meta:
abstract = True
@@ -49,83 +57,44 @@ class BaseCount(models.Model):
class InstallationCount(BaseCount):
class Meta:
# Handles invalid duplicate InstallationCount data
constraints = [
UniqueConstraint(
fields=["property", "subgroup", "end_time"],
condition=Q(subgroup__isnull=False),
name='unique_installation_count'),
UniqueConstraint(
fields=["property", "end_time"],
condition=Q(subgroup__isnull=True),
name='unique_installation_count_null_subgroup'),
]
unique_together = ("property", "subgroup", "end_time")
def __str__(self) -> str:
return f"<InstallationCount: {self.property} {self.subgroup} {self.value}>"
return "<InstallationCount: %s %s %s>" % (self.property, self.subgroup, self.value)
class RealmCount(BaseCount):
realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
class Meta:
# Handles invalid duplicate RealmCount data
constraints = [
UniqueConstraint(
fields=["realm", "property", "subgroup", "end_time"],
condition=Q(subgroup__isnull=False),
name='unique_realm_count'),
UniqueConstraint(
fields=["realm", "property", "end_time"],
condition=Q(subgroup__isnull=True),
name='unique_realm_count_null_subgroup'),
]
unique_together = ("realm", "property", "subgroup", "end_time")
index_together = ["property", "end_time"]
def __str__(self) -> str:
return f"<RealmCount: {self.realm} {self.property} {self.subgroup} {self.value}>"
return "<RealmCount: %s %s %s %s>" % (self.realm, self.property, self.subgroup, self.value)
class UserCount(BaseCount):
user = models.ForeignKey(UserProfile, on_delete=models.CASCADE)
realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
class Meta:
# Handles invalid duplicate UserCount data
constraints = [
UniqueConstraint(
fields=["user", "property", "subgroup", "end_time"],
condition=Q(subgroup__isnull=False),
name='unique_user_count'),
UniqueConstraint(
fields=["user", "property", "end_time"],
condition=Q(subgroup__isnull=True),
name='unique_user_count_null_subgroup'),
]
unique_together = ("user", "property", "subgroup", "end_time")
# This index dramatically improves the performance of
# aggregating from users to realms
index_together = ["property", "realm", "end_time"]
def __str__(self) -> str:
return f"<UserCount: {self.user} {self.property} {self.subgroup} {self.value}>"
return "<UserCount: %s %s %s %s>" % (self.user, self.property, self.subgroup, self.value)
class StreamCount(BaseCount):
stream = models.ForeignKey(Stream, on_delete=models.CASCADE)
realm = models.ForeignKey(Realm, on_delete=models.CASCADE)
class Meta:
# Handles invalid duplicate StreamCount data
constraints = [
UniqueConstraint(
fields=["stream", "property", "subgroup", "end_time"],
condition=Q(subgroup__isnull=False),
name='unique_stream_count'),
UniqueConstraint(
fields=["stream", "property", "end_time"],
condition=Q(subgroup__isnull=True),
name='unique_stream_count_null_subgroup'),
]
unique_together = ("stream", "property", "subgroup", "end_time")
# This index dramatically improves the performance of
# aggregating from streams to realms
index_together = ["property", "realm", "end_time"]
def __str__(self) -> str:
return f"<StreamCount: {self.stream} {self.property} {self.subgroup} {self.value} {self.id}>"
return "<StreamCount: %s %s %s %s %s>" % (
self.stream, self.property, self.subgroup, self.value, self.id)


@@ -1,142 +1,90 @@
from datetime import datetime, timedelta, timezone
from typing import Any, Dict, List, Optional, Tuple, Type
from unittest import mock
from datetime import datetime, timedelta
from typing import Any, Dict, List, Optional, Tuple, Type, Union
import ujson
from django.apps import apps
from django.db import models
from django.db.models import Sum
from django.test import TestCase
from django.utils.timezone import now as timezone_now
from psycopg2.sql import SQL, Literal
from django.utils.timezone import utc as timezone_utc
from analytics.lib.counts import (
COUNT_STATS,
CountStat,
DependentCountStat,
LoggingCountStat,
do_aggregate_to_summary_table,
do_drop_all_analytics_tables,
do_drop_single_stat,
do_fill_count_stat_at_hour,
do_increment_logging_stat,
get_count_stats,
process_count_stat,
sql_data_collector,
)
from analytics.models import (
BaseCount,
FillState,
InstallationCount,
RealmCount,
StreamCount,
UserCount,
installation_epoch,
)
from zerver.lib.actions import (
InvitationError,
do_activate_user,
do_create_user,
do_deactivate_user,
do_invite_users,
do_mark_all_as_read,
do_mark_stream_messages_as_read,
do_reactivate_user,
do_resend_user_invite_email,
do_revoke_user_invite,
do_update_message_flags,
update_user_activity_interval,
)
from zerver.lib.create_user import create_user
from zerver.lib.test_classes import ZulipTestCase
from analytics.lib.counts import COUNT_STATS, CountStat, DataCollector, \
DependentCountStat, LoggingCountStat, do_aggregate_to_summary_table, \
do_drop_all_analytics_tables, do_drop_single_stat, \
do_fill_count_stat_at_hour, do_increment_logging_stat, \
process_count_stat, sql_data_collector
from analytics.models import Anomaly, BaseCount, \
FillState, InstallationCount, RealmCount, StreamCount, \
UserCount, installation_epoch, last_successful_fill
from zerver.lib.actions import do_activate_user, do_create_user, \
do_deactivate_user, do_reactivate_user, update_user_activity_interval, \
do_invite_users, do_revoke_user_invite, do_resend_user_invite_email, \
InvitationError
from zerver.lib.timestamp import TimezoneNotUTCException, floor_to_day
from zerver.lib.topic import DB_TOPIC_NAME
from zerver.models import (
Client,
Huddle,
Message,
PreregistrationUser,
Realm,
RealmAuditLog,
Recipient,
Stream,
UserActivityInterval,
UserProfile,
get_client,
get_user,
)
from zerver.models import Client, Huddle, Message, Realm, \
RealmAuditLog, Recipient, Stream, UserActivityInterval, \
UserProfile, get_client, get_user, PreregistrationUser
class AnalyticsTestCase(ZulipTestCase):
class AnalyticsTestCase(TestCase):
MINUTE = timedelta(seconds = 60)
HOUR = MINUTE * 60
DAY = HOUR * 24
TIME_ZERO = datetime(1988, 3, 14, tzinfo=timezone.utc)
TIME_ZERO = datetime(1988, 3, 14).replace(tzinfo=timezone_utc)
TIME_LAST_HOUR = TIME_ZERO - HOUR
def setUp(self) -> None:
super().setUp()
self.default_realm = Realm.objects.create(
string_id='realmtest', name='Realm Test', date_created=self.TIME_ZERO - 2*self.DAY)
# used to generate unique names in self.create_*
self.name_counter = 100
# used as defaults in self.assertCountEquals
self.current_property: Optional[str] = None
self.current_property = None # type: Optional[str]
# Lightweight creation of users, streams, and messages
def create_user(self, **kwargs: Any) -> UserProfile:
self.name_counter += 1
defaults = {
'email': f'user{self.name_counter}@domain.tld',
'email': 'user%s@domain.tld' % (self.name_counter,),
'date_joined': self.TIME_LAST_HOUR,
'full_name': 'full_name',
'short_name': 'short_name',
'is_active': True,
'is_bot': False,
'realm': self.default_realm}
'pointer': -1,
'last_pointer_updater': 'seems unused?',
'realm': self.default_realm,
'api_key': '42'}
for key, value in defaults.items():
kwargs[key] = kwargs.get(key, value)
kwargs['delivery_email'] = kwargs['email']
with mock.patch("zerver.lib.create_user.timezone_now", return_value=kwargs['date_joined']):
pass_kwargs: Dict[str, Any] = {}
if kwargs['is_bot']:
pass_kwargs['bot_type'] = UserProfile.DEFAULT_BOT
pass_kwargs['bot_owner'] = None
return create_user(kwargs['email'], 'password', kwargs['realm'],
active=kwargs['is_active'],
full_name=kwargs['full_name'], short_name=kwargs['short_name'],
role=UserProfile.ROLE_REALM_ADMINISTRATOR, **pass_kwargs)
return UserProfile.objects.create(**kwargs)
def create_stream_with_recipient(self, **kwargs: Any) -> Tuple[Stream, Recipient]:
self.name_counter += 1
defaults = {'name': f'stream name {self.name_counter}',
defaults = {'name': 'stream name %s' % (self.name_counter,),
'realm': self.default_realm,
'date_created': self.TIME_LAST_HOUR}
for key, value in defaults.items():
kwargs[key] = kwargs.get(key, value)
stream = Stream.objects.create(**kwargs)
recipient = Recipient.objects.create(type_id=stream.id, type=Recipient.STREAM)
stream.recipient = recipient
stream.save(update_fields=["recipient"])
return stream, recipient
def create_huddle_with_recipient(self, **kwargs: Any) -> Tuple[Huddle, Recipient]:
self.name_counter += 1
defaults = {'huddle_hash': f'hash{self.name_counter}'}
defaults = {'huddle_hash': 'hash%s' % (self.name_counter,)}
for key, value in defaults.items():
kwargs[key] = kwargs.get(key, value)
huddle = Huddle.objects.create(**kwargs)
recipient = Recipient.objects.create(type_id=huddle.id, type=Recipient.HUDDLE)
huddle.recipient = recipient
huddle.save(update_fields=["recipient"])
return huddle, recipient
def create_message(self, sender: UserProfile, recipient: Recipient, **kwargs: Any) -> Message:
defaults = {
'sender': sender,
'recipient': recipient,
DB_TOPIC_NAME: 'subject',
'subject': 'subject',
'content': 'hi',
'date_sent': self.TIME_LAST_HOUR,
'pub_date': self.TIME_LAST_HOUR,
'sending_client': get_client("website")}
for key, value in defaults.items():
kwargs[key] = kwargs.get(key, value)
@@ -186,7 +134,7 @@ class AnalyticsTestCase(ZulipTestCase):
'end_time': self.TIME_ZERO,
'value': 1}
for values in arg_values:
kwargs: Dict[str, Any] = {}
kwargs = {} # type: Dict[str, Any]
for i in range(len(values)):
kwargs[arg_keys[i]] = values[i]
for key, value in defaults.items():
@@ -204,13 +152,8 @@ class AnalyticsTestCase(ZulipTestCase):
class TestProcessCountStat(AnalyticsTestCase):
def make_dummy_count_stat(self, property: str) -> CountStat:
query = lambda kwargs: SQL("""
INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES ({default_realm_id}, 1, {property}, %(time_end)s)
""").format(
default_realm_id=Literal(self.default_realm.id),
property=Literal(property),
)
query = """INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES (%s, 1, '%s', %%%%(time_end)s)""" % (self.default_realm.id, property)
return CountStat(property, sql_data_collector(RealmCount, query, None), CountStat.HOUR)
def assertFillStateEquals(self, stat: CountStat, end_time: datetime,
@@ -308,13 +251,8 @@ class TestProcessCountStat(AnalyticsTestCase):
def test_process_dependent_stat(self) -> None:
stat1 = self.make_dummy_count_stat('stat1')
stat2 = self.make_dummy_count_stat('stat2')
query = lambda kwargs: SQL("""
INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES ({default_realm_id}, 1, {property}, %(time_end)s)
""").format(
default_realm_id=Literal(self.default_realm.id),
property=Literal('stat3'),
)
query = """INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES (%s, 1, '%s', %%%%(time_end)s)""" % (self.default_realm.id, 'stat3')
stat3 = DependentCountStat('stat3', sql_data_collector(RealmCount, query, None),
CountStat.HOUR,
dependencies=['stat1', 'stat2'])
@@ -347,13 +285,8 @@ class TestProcessCountStat(AnalyticsTestCase):
self.assertFillStateEquals(stat3, hour[2])
# test daily dependent stat with hourly dependencies
query = lambda kwargs: SQL("""
INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES ({default_realm_id}, 1, {property}, %(time_end)s)
""").format(
default_realm_id=Literal(self.default_realm.id),
property=Literal('stat4'),
)
query = """INSERT INTO analytics_realmcount (realm_id, value, property, end_time)
VALUES (%s, 1, '%s', %%%%(time_end)s)""" % (self.default_realm.id, 'stat4')
stat4 = DependentCountStat('stat4', sql_data_collector(RealmCount, query, None),
CountStat.DAY,
dependencies=['stat1', 'stat2'])
@@ -376,12 +309,12 @@ class TestCountStats(AnalyticsTestCase):
date_created=self.TIME_ZERO-2*self.DAY)
for minutes_ago in [0, 1, 61, 60*24+1]:
creation_time = self.TIME_ZERO - minutes_ago*self.MINUTE
user = self.create_user(email=f'user-{minutes_ago}@second.analytics',
user = self.create_user(email='user-%s@second.analytics' % (minutes_ago,),
realm=self.second_realm, date_joined=creation_time)
recipient = self.create_stream_with_recipient(
name=f'stream {minutes_ago}', realm=self.second_realm,
name='stream %s' % (minutes_ago,), realm=self.second_realm,
date_created=creation_time)[1]
self.create_message(user, recipient, date_sent=creation_time)
self.create_message(user, recipient, pub_date=creation_time)
self.hourly_user = get_user('user-1@second.analytics', self.second_realm)
self.daily_user = get_user('user-61@second.analytics', self.second_realm)
@@ -419,29 +352,6 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(UserCount, [], [])
self.assertTableState(StreamCount, [], [])
def test_active_users_by_is_bot_for_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['active_users:is_bot:day']
self.current_property = stat.property
# To be included
self.create_user(is_bot=True, date_joined=self.TIME_ZERO-25*self.HOUR)
self.create_user(is_bot=False)
# To be excluded
self.create_user(email='test@second.analytics',
realm=self.second_realm, date_joined=self.TIME_ZERO-2*self.DAY)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
self.assertTableState(RealmCount, ['value', 'subgroup'],
[[1, 'true'], [1, 'false']])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
self.assertTableState(UserCount, [], [])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_by_is_bot(self) -> None:
stat = COUNT_STATS['messages_sent:is_bot:hour']
self.current_property = stat.property
@@ -449,8 +359,8 @@ class TestCountStats(AnalyticsTestCase):
bot = self.create_user(is_bot=True)
human1 = self.create_user()
human2 = self.create_user()
recipient_human1 = Recipient.objects.get(type_id=human1.id,
type=Recipient.PERSONAL)
recipient_human1 = Recipient.objects.create(type_id=human1.id,
type=Recipient.PERSONAL)
recipient_stream = self.create_stream_with_recipient()[1]
recipient_huddle = self.create_huddle_with_recipient()[1]
@@ -471,46 +381,6 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(InstallationCount, ['value', 'subgroup'], [[3, 'false'], [3, 'true']])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_by_is_bot_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['messages_sent:is_bot:hour']
self.current_property = stat.property
bot = self.create_user(is_bot=True)
human1 = self.create_user()
human2 = self.create_user()
recipient_human1 = Recipient.objects.get(type_id=human1.id,
type=Recipient.PERSONAL)
recipient_stream = self.create_stream_with_recipient()[1]
recipient_huddle = self.create_huddle_with_recipient()[1]
# To be included
self.create_message(bot, recipient_human1)
self.create_message(bot, recipient_stream)
self.create_message(bot, recipient_huddle)
self.create_message(human1, recipient_human1)
self.create_message(human2, recipient_human1)
# To be excluded
self.create_message(self.hourly_user, recipient_human1)
self.create_message(self.hourly_user, recipient_stream)
self.create_message(self.hourly_user, recipient_huddle)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
self.assertTableState(UserCount, ['value', 'subgroup', 'user'],
[[1, 'false', human1], [1, 'false', human2],
[3, 'true', bot]])
self.assertTableState(RealmCount, ['value', 'subgroup', 'realm'],
[[2, 'false', self.default_realm],
[3, 'true', self.default_realm]])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_by_message_type(self) -> None:
stat = COUNT_STATS['messages_sent:message_type:day']
self.current_property = stat.property
@@ -544,9 +414,9 @@ class TestCountStats(AnalyticsTestCase):
self.create_message(user2, recipient_huddle2)
# private messages
recipient_user1 = Recipient.objects.get(type_id=user1.id, type=Recipient.PERSONAL)
recipient_user2 = Recipient.objects.get(type_id=user2.id, type=Recipient.PERSONAL)
recipient_user3 = Recipient.objects.get(type_id=user3.id, type=Recipient.PERSONAL)
recipient_user1 = Recipient.objects.create(type_id=user1.id, type=Recipient.PERSONAL)
recipient_user2 = Recipient.objects.create(type_id=user2.id, type=Recipient.PERSONAL)
recipient_user3 = Recipient.objects.create(type_id=user3.id, type=Recipient.PERSONAL)
self.create_message(user1, recipient_user2)
self.create_message(user2, recipient_user1)
self.create_message(user3, recipient_user3)
@@ -573,49 +443,12 @@ class TestCountStats(AnalyticsTestCase):
[2, 'huddle_message']])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_by_message_type_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['messages_sent:message_type:day']
self.current_property = stat.property
user = self.create_user()
user_recipient = Recipient.objects.get(type_id=user.id, type=Recipient.PERSONAL)
private_stream_recipient = self.create_stream_with_recipient(invite_only=True)[1]
stream_recipient = self.create_stream_with_recipient()[1]
huddle_recipient = self.create_huddle_with_recipient()[1]
# To be included
self.create_message(user, user_recipient)
self.create_message(user, private_stream_recipient)
self.create_message(user, stream_recipient)
self.create_message(user, huddle_recipient)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
# To be excluded
self.create_message(self.hourly_user, user_recipient)
self.create_message(self.hourly_user, private_stream_recipient)
self.create_message(self.hourly_user, stream_recipient)
self.create_message(self.hourly_user, huddle_recipient)
self.assertTableState(UserCount, ['value', 'subgroup', 'user'],
[[1, 'private_message', user], [1, 'private_stream', user],
[1, 'huddle_message', user], [1, 'public_stream', user]])
self.assertTableState(RealmCount, ['value', 'subgroup'],
[[1, 'private_message'], [1, 'private_stream'],
[1, 'public_stream'], [1, 'huddle_message']])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_to_recipients_with_same_id(self) -> None:
stat = COUNT_STATS['messages_sent:message_type:day']
self.current_property = stat.property
user = self.create_user(id=1000)
user_recipient = Recipient.objects.get(type_id=user.id, type=Recipient.PERSONAL)
user_recipient = Recipient.objects.create(type_id=user.id, type=Recipient.PERSONAL)
stream_recipient = self.create_stream_with_recipient(id=1000)[1]
huddle_recipient = self.create_huddle_with_recipient(id=1000)[1]
@@ -635,7 +468,7 @@ class TestCountStats(AnalyticsTestCase):
user1 = self.create_user(is_bot=True)
user2 = self.create_user()
recipient_user2 = Recipient.objects.get(type_id=user2.id, type=Recipient.PERSONAL)
recipient_user2 = Recipient.objects.create(type_id=user2.id, type=Recipient.PERSONAL)
recipient_stream = self.create_stream_with_recipient()[1]
recipient_huddle = self.create_huddle_with_recipient()[1]
@@ -664,42 +497,6 @@ class TestCountStats(AnalyticsTestCase):
[[4, website_client_id], [3, client2_id]])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_by_client_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['messages_sent:client:day']
self.current_property = stat.property
user1 = self.create_user(is_bot=True)
user2 = self.create_user()
recipient_user2 = Recipient.objects.get(type_id=user2.id, type=Recipient.PERSONAL)
client2 = Client.objects.create(name='client2')
# To be included
self.create_message(user1, recipient_user2, sending_client=client2)
self.create_message(user2, recipient_user2, sending_client=client2)
self.create_message(user2, recipient_user2)
# To be excluded
self.create_message(self.hourly_user, recipient_user2, sending_client=client2)
self.create_message(self.hourly_user, recipient_user2, sending_client=client2)
self.create_message(self.hourly_user, recipient_user2)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
client2_id = str(client2.id)
website_client_id = str(get_client('website').id) # default for self.create_message
self.assertTableState(UserCount, ['value', 'subgroup', 'user'],
[[1, client2_id, user1], [1, client2_id, user2],
[1, website_client_id, user2]])
self.assertTableState(RealmCount, ['value', 'subgroup'],
[[1, website_client_id], [2, client2_id]])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
self.assertTableState(StreamCount, [], [])
def test_messages_sent_to_stream_by_is_bot(self) -> None:
stat = COUNT_STATS['messages_in_stream:is_bot:day']
self.current_property = stat.property
@@ -707,7 +504,7 @@ class TestCountStats(AnalyticsTestCase):
bot = self.create_user(is_bot=True)
human1 = self.create_user()
human2 = self.create_user()
recipient_human1 = Recipient.objects.get(type_id=human1.id, type=Recipient.PERSONAL)
recipient_human1 = Recipient.objects.create(type_id=human1.id, type=Recipient.PERSONAL)
stream1, recipient_stream1 = self.create_stream_with_recipient()
stream2, recipient_stream2 = self.create_stream_with_recipient()
@@ -737,39 +534,6 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(InstallationCount, ['value', 'subgroup'], [[5, 'false'], [2, 'true']])
self.assertTableState(UserCount, [], [])
def test_messages_sent_to_stream_by_is_bot_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['messages_in_stream:is_bot:day']
self.current_property = stat.property
human1 = self.create_user()
bot = self.create_user(is_bot=True)
realm = {'realm': self.second_realm}
stream1, recipient_stream1 = self.create_stream_with_recipient()
stream2, recipient_stream2 = self.create_stream_with_recipient(**realm)
# To be included
self.create_message(human1, recipient_stream1)
self.create_message(bot, recipient_stream1)
# To be excluded
self.create_message(self.hourly_user, recipient_stream2)
self.create_message(self.daily_user, recipient_stream2)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
self.assertTableState(StreamCount, ['value', 'subgroup', 'stream'],
[[1, 'false', stream1],
[1, 'true', stream1]])
self.assertTableState(RealmCount, ['value', 'subgroup', 'realm'],
[[1, 'false'], [1, 'true']])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value', 'subgroup'], [])
self.assertTableState(UserCount, [], [])
def create_interval(self, user: UserProfile, start_offset: timedelta,
end_offset: timedelta) -> None:
UserActivityInterval.objects.create(
@@ -819,34 +583,6 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(InstallationCount, ['value'], [[6]])
self.assertTableState(StreamCount, [], [])
def test_1day_actives_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['1day_actives::day']
self.current_property = stat.property
_1day = 1*self.DAY - UserActivityInterval.MIN_INTERVAL_LENGTH
user1 = self.create_user()
user2 = self.create_user()
# To be included
self.create_interval(user1, 20*self.HOUR, 19*self.HOUR)
self.create_interval(user2, _1day + self.DAY, _1day)
# To be excluded
user3 = self.create_user(realm=self.second_realm)
self.create_interval(user3, 20*self.MINUTE, 19*self.MINUTE)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
self.assertTableState(UserCount, ['value', 'user'],
[[1, user2], [1, user2]])
self.assertTableState(RealmCount, ['value', 'realm'],
[[2, self.default_realm]])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value'], [])
self.assertTableState(StreamCount, [], [])
def test_15day_actives(self) -> None:
stat = COUNT_STATS['15day_actives::day']
self.current_property = stat.property
@@ -890,36 +626,6 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(InstallationCount, ['value'], [[6]])
self.assertTableState(StreamCount, [], [])
def test_15day_actives_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['15day_actives::day']
self.current_property = stat.property
_15day = 15*self.DAY - UserActivityInterval.MIN_INTERVAL_LENGTH
user1 = self.create_user()
user2 = self.create_user()
user3 = self.create_user(realm=self.second_realm)
# To be included
self.create_interval(user1, _15day + self.DAY, _15day)
self.create_interval(user2, 20*self.HOUR, 19*self.HOUR)
# To be excluded
self.create_interval(user3, 20*self.HOUR, 19*self.HOUR)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
self.assertTableState(UserCount, ['value', 'user'],
[[1, user1], [1, user2]])
self.assertTableState(RealmCount, ['value', 'realm'],
[[2, self.default_realm]])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value'], [])
self.assertTableState(StreamCount, [], [])
def test_minutes_active(self) -> None:
stat = COUNT_STATS['minutes_active::day']
self.current_property = stat.property
@@ -962,35 +668,6 @@ class TestCountStats(AnalyticsTestCase):
self.assertTableState(InstallationCount, ['value'], [[61 + 121 + 24*60 + 1]])
self.assertTableState(StreamCount, [], [])
def test_minutes_active_realm_constraint(self) -> None:
# For single Realm
COUNT_STATS = get_count_stats(self.default_realm)
stat = COUNT_STATS['minutes_active::day']
self.current_property = stat.property
# Outside time range, should not appear. Also testing for intervals
# starting and ending on boundary
user1 = self.create_user()
user2 = self.create_user()
user3 = self.create_user(realm=self.second_realm)
# To be included
self.create_interval(user1, 20*self.HOUR, 19*self.HOUR)
self.create_interval(user2, 20*self.MINUTE, 19*self.MINUTE)
# To be excluded
self.create_interval(user3, 20*self.MINUTE, 19*self.MINUTE)
do_fill_count_stat_at_hour(stat, self.TIME_ZERO, self.default_realm)
self.assertTableState(UserCount, ['value', 'user'],
[[60, user1], [1, user2]])
self.assertTableState(RealmCount, ['value', 'realm'],
[[60 + 1, self.default_realm]])
# No aggregation to InstallationCount with realm constraint
self.assertTableState(InstallationCount, ['value'], [])
self.assertTableState(StreamCount, [], [])
class TestDoAggregateToSummaryTable(AnalyticsTestCase):
# do_aggregate_to_summary_table is mostly tested by the end to end
# nature of the tests in TestCountStats. But want to highlight one
@@ -1057,12 +734,12 @@ class TestDoIncrementLoggingStat(AnalyticsTestCase):
self.current_property = 'test'
self.assertTableState(RealmCount, ['value', 'subgroup', 'end_time'],
[[1, 'subgroup1', self.TIME_ZERO], [1, 'subgroup2', self.TIME_ZERO],
[1, 'subgroup1', self.TIME_LAST_HOUR]])
[1, 'subgroup1', self.TIME_LAST_HOUR]])
# This should trigger the get part of get_or_create
do_increment_logging_stat(self.default_realm, stat, 'subgroup1', self.TIME_ZERO)
self.assertTableState(RealmCount, ['value', 'subgroup', 'end_time'],
[[2, 'subgroup1', self.TIME_ZERO], [1, 'subgroup2', self.TIME_ZERO],
[1, 'subgroup1', self.TIME_LAST_HOUR]])
[1, 'subgroup1', self.TIME_LAST_HOUR]])
def test_increment(self) -> None:
stat = LoggingCountStat('test', RealmCount, CountStat.DAY)
@@ -1154,39 +831,6 @@ class TestLoggingCountStats(AnalyticsTestCase):
do_resend_user_invite_email(PreregistrationUser.objects.first())
assertInviteCountEquals(6)
def test_messages_read_hour(self) -> None:
read_count_property = 'messages_read::hour'
interactions_property = 'messages_read_interactions::hour'
user1 = self.create_user()
user2 = self.create_user()
stream, recipient = self.create_stream_with_recipient()
self.subscribe(user1, stream.name)
self.subscribe(user2, stream.name)
self.send_personal_message(user1, user2)
client = get_client("website")
do_mark_all_as_read(user2, client)
self.assertEqual(1, UserCount.objects.filter(property=read_count_property)
.aggregate(Sum('value'))['value__sum'])
self.assertEqual(1, UserCount.objects.filter(property=interactions_property)
.aggregate(Sum('value'))['value__sum'])
self.send_stream_message(user1, stream.name)
self.send_stream_message(user1, stream.name)
do_mark_stream_messages_as_read(user2, client, stream)
self.assertEqual(3, UserCount.objects.filter(property=read_count_property)
.aggregate(Sum('value'))['value__sum'])
self.assertEqual(2, UserCount.objects.filter(property=interactions_property)
.aggregate(Sum('value'))['value__sum'])
message = self.send_stream_message(user2, stream.name)
do_update_message_flags(user1, client, 'add', 'read', [message])
self.assertEqual(4, UserCount.objects.filter(property=read_count_property)
.aggregate(Sum('value'))['value__sum'])
self.assertEqual(3, UserCount.objects.filter(property=interactions_property)
.aggregate(Sum('value'))['value__sum'])
class TestDeleteStats(AnalyticsTestCase):
def test_do_drop_all_analytics_tables(self) -> None:
user = self.create_user()
@@ -1198,6 +842,7 @@ class TestDeleteStats(AnalyticsTestCase):
RealmCount.objects.create(realm=user.realm, **count_args)
InstallationCount.objects.create(**count_args)
FillState.objects.create(property='test', end_time=self.TIME_ZERO, state=FillState.DONE)
Anomaly.objects.create(info='test anomaly')
analytics = apps.get_app_config('analytics')
for table in list(analytics.models.values()):
@@ -1220,6 +865,7 @@ class TestDeleteStats(AnalyticsTestCase):
InstallationCount.objects.create(**count_args)
FillState.objects.create(property='to_delete', end_time=self.TIME_ZERO, state=FillState.DONE)
FillState.objects.create(property='to_save', end_time=self.TIME_ZERO, state=FillState.DONE)
Anomaly.objects.create(info='test anomaly')
analytics = apps.get_app_config('analytics')
for table in list(analytics.models.values()):
@@ -1227,8 +873,11 @@ class TestDeleteStats(AnalyticsTestCase):
do_drop_single_stat('to_delete')
for table in list(analytics.models.values()):
self.assertFalse(table.objects.filter(property='to_delete').exists())
self.assertTrue(table.objects.filter(property='to_save').exists())
if table._meta.db_table == 'analytics_anomaly':
self.assertTrue(table.objects.exists())
else:
self.assertFalse(table.objects.filter(property='to_delete').exists())
self.assertTrue(table.objects.filter(property='to_save').exists())
class TestActiveUsersAudit(AnalyticsTestCase):
def setUp(self) -> None:
@@ -1237,7 +886,7 @@ class TestActiveUsersAudit(AnalyticsTestCase):
self.stat = COUNT_STATS['active_users_audit:is_bot:day']
self.current_property = self.stat.property
def add_event(self, event_type: int, days_offset: float,
def add_event(self, event_type: str, days_offset: float,
user: Optional[UserProfile]=None) -> None:
hours_offset = int(24*days_offset)
if user is None:
@@ -1331,7 +980,7 @@ class TestActiveUsersAudit(AnalyticsTestCase):
# doesn't go through do_create_user. Mainly just want to make sure that
# that situation doesn't throw an error.
def test_empty_realm_or_user_with_no_relevant_activity(self) -> None:
self.add_event(RealmAuditLog.USER_SOFT_ACTIVATED, 1)
self.add_event('unrelated', 1)
self.create_user() # also test a user with no RealmAuditLog entries
Realm.objects.create(string_id='moo', name='moo')
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
@@ -1339,14 +988,14 @@ class TestActiveUsersAudit(AnalyticsTestCase):
def test_max_audit_entry_is_unrelated(self) -> None:
self.add_event(RealmAuditLog.USER_CREATED, 1)
self.add_event(RealmAuditLog.USER_SOFT_ACTIVATED, .5)
self.add_event('unrelated', .5)
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
self.assertTableState(UserCount, ['subgroup'], [['false']])
# Simultaneous related audit entries should not be allowed, and so not testing for that.
def test_simultaneous_unrelated_audit_entry(self) -> None:
self.add_event(RealmAuditLog.USER_CREATED, 1)
self.add_event(RealmAuditLog.USER_SOFT_ACTIVATED, 1)
self.add_event('unrelated', 1)
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO)
self.assertTableState(UserCount, ['subgroup'], [['false']])
@@ -1376,7 +1025,7 @@ class TestActiveUsersAudit(AnalyticsTestCase):
self.assertTrue(UserCount.objects.filter(
user=user, property=self.current_property, subgroup='false',
end_time=end_time, value=1).exists())
self.assertFalse(UserCount.objects.filter(user=user2, end_time=end_time).exists())
self.assertFalse(UserCount.objects.filter(user=user2).exists())
class TestRealmActiveHumans(AnalyticsTestCase):
def setUp(self) -> None:
@@ -1448,7 +1097,6 @@ class TestRealmActiveHumans(AnalyticsTestCase):
self.create_user(realm=third_realm)
RealmCount.objects.all().delete()
InstallationCount.objects.all().delete()
for i in [-1, 0, 1]:
do_fill_count_stat_at_hour(self.stat, self.TIME_ZERO + i*self.DAY)
self.assertTableState(RealmCount, ['value', 'realm', 'end_time'],


@@ -2,7 +2,6 @@ from analytics.lib.counts import CountStat
from analytics.lib.fixtures import generate_time_series_data
from zerver.lib.test_classes import ZulipTestCase
# A very light test suite; the code being tested is not run in production.
class TestFixtures(ZulipTestCase):
def test_deterministic_settings(self) -> None:


@@ -1,27 +1,24 @@
from datetime import datetime, timedelta, timezone
from typing import List, Optional
from unittest import mock
from datetime import datetime, timedelta
from typing import Dict, List, Optional
import ujson
from django.http import HttpResponse
from django.utils.timezone import now as timezone_now
import mock
from django.utils.timezone import utc
from analytics.lib.counts import COUNT_STATS, CountStat
from analytics.lib.time_utils import time_range
from analytics.models import FillState, RealmCount, UserCount, last_successful_fill
from analytics.views import rewrite_client_arrays, sort_by_totals, sort_client_labels
from corporate.models import get_customer_by_realm
from zerver.lib.actions import do_create_multiuse_invite_link, do_send_realm_reactivation_email
from analytics.models import FillState, \
RealmCount, UserCount, last_successful_fill
from analytics.views import get_chart_data, rewrite_client_arrays, \
sort_by_totals, sort_client_labels, stats
from zerver.lib.test_classes import ZulipTestCase
from zerver.lib.test_helpers import reset_emails_in_zulip_realm
from zerver.lib.timestamp import ceiling_to_day, ceiling_to_hour, datetime_to_timestamp
from zerver.models import Client, MultiuseInvite, PreregistrationUser, get_realm
from zerver.lib.timestamp import ceiling_to_day, \
ceiling_to_hour, datetime_to_timestamp
from zerver.models import Client, get_realm
class TestStatsEndpoint(ZulipTestCase):
def test_stats(self) -> None:
self.user = self.example_user('hamlet')
self.login_user(self.user)
self.login(self.user.email)
result = self.client_get('/stats')
self.assertEqual(result.status_code, 200)
# Check that we get something back
@@ -29,7 +26,7 @@ class TestStatsEndpoint(ZulipTestCase):
def test_guest_user_cant_access_stats(self) -> None:
self.user = self.example_user('polonius')
self.login_user(self.user)
self.login(self.user.email)
result = self.client_get('/stats')
self.assert_json_error(result, "Not allowed for guest users", 400)
@@ -37,15 +34,15 @@ class TestStatsEndpoint(ZulipTestCase):
self.assert_json_error(result, "Not allowed for guest users", 400)
def test_stats_for_realm(self) -> None:
user = self.example_user('hamlet')
self.login_user(user)
user_profile = self.example_user('hamlet')
self.login(user_profile.email)
result = self.client_get('/stats/realm/zulip/')
self.assertEqual(result.status_code, 302)
user = self.example_user('hamlet')
user.is_staff = True
user.save(update_fields=['is_staff'])
user_profile = self.example_user('hamlet')
user_profile.is_staff = True
user_profile.save(update_fields=['is_staff'])
result = self.client_get('/stats/realm/not_existing_realm/')
self.assertEqual(result.status_code, 302)
@@ -55,15 +52,15 @@ class TestStatsEndpoint(ZulipTestCase):
self.assert_in_response("Zulip analytics for", result)
def test_stats_for_installation(self) -> None:
user = self.example_user('hamlet')
self.login_user(user)
user_profile = self.example_user('hamlet')
self.login(user_profile.email)
result = self.client_get('/stats/installation')
self.assertEqual(result.status_code, 302)
user = self.example_user('hamlet')
user.is_staff = True
user.save(update_fields=['is_staff'])
user_profile = self.example_user('hamlet')
user_profile.is_staff = True
user_profile.save(update_fields=['is_staff'])
result = self.client_get('/stats/installation')
self.assertEqual(result.status_code, 200)
@@ -71,10 +68,9 @@ class TestStatsEndpoint(ZulipTestCase):
class TestGetChartData(ZulipTestCase):
def setUp(self) -> None:
super().setUp()
self.realm = get_realm('zulip')
self.user = self.example_user('hamlet')
self.login_user(self.user)
self.login(self.user.email)
self.end_times_hour = [ceiling_to_hour(self.realm.date_created) + timedelta(hours=i)
for i in range(4)]
self.end_times_day = [ceiling_to_day(self.realm.date_created) + timedelta(days=i)
@@ -182,23 +178,6 @@ class TestGetChartData(ZulipTestCase):
'result': 'success',
})
def test_messages_read_over_time(self) -> None:
stat = COUNT_STATS['messages_read::hour']
self.insert_data(stat, [None], [])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_read_over_time'})
self.assert_json_success(result)
data = result.json()
self.assertEqual(data, {
'msg': '',
'end_times': [datetime_to_timestamp(dt) for dt in self.end_times_hour],
'frequency': CountStat.HOUR,
'everyone': {'read': self.data(100)},
'user': {'read': self.data(0)},
'display_order': None,
'result': 'success',
})
def test_include_empty_subgroups(self) -> None:
FillState.objects.create(
property='realm_active_humans::day', end_time=self.end_times_day[0],
@@ -301,86 +280,24 @@ class TestGetChartData(ZulipTestCase):
self.assert_json_error_contains(result, 'Unknown chart name')
def test_analytics_not_running(self) -> None:
realm = get_realm("zulip")
self.assertEqual(FillState.objects.count(), 0)
realm.date_created = timezone_now() - timedelta(days=3)
realm.save(update_fields=["date_created"])
# try to get data for a valid chart, but before we've put anything in the database
# (e.g. before update_analytics_counts has been run)
with mock.patch('logging.warning'):
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
{'chart_name': 'number_of_humans'})
self.assert_json_error_contains(result, 'No analytics data available')
realm.date_created = timezone_now() - timedelta(days=1, hours=2)
realm.save(update_fields=["date_created"])
with mock.patch('logging.warning'):
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_error_contains(result, 'No analytics data available')
realm.date_created = timezone_now() - timedelta(days=1, minutes=10)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
realm.date_created = timezone_now() - timedelta(hours=10)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
end_time = timezone_now() - timedelta(days=5)
fill_state = FillState.objects.create(property='messages_sent:is_bot:hour', end_time=end_time,
state=FillState.DONE)
realm.date_created = timezone_now() - timedelta(days=3)
realm.save(update_fields=["date_created"])
with mock.patch('logging.warning'):
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_error_contains(result, 'No analytics data available')
realm.date_created = timezone_now() - timedelta(days=1, minutes=10)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
end_time = timezone_now() - timedelta(days=2)
fill_state.end_time = end_time
fill_state.save(update_fields=["end_time"])
realm.date_created = timezone_now() - timedelta(days=3)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
realm.date_created = timezone_now() - timedelta(days=1, hours=2)
realm.save(update_fields=["date_created"])
with mock.patch('logging.warning'):
result = self.client_get('/json/analytics/chart_data',
{'chart_name': 'messages_sent_over_time'})
self.assert_json_error_contains(result, 'No analytics data available')
realm.date_created = timezone_now() - timedelta(days=1, minutes=10)
realm.save(update_fields=["date_created"])
result = self.client_get('/json/analytics/chart_data', {'chart_name': 'messages_sent_over_time'})
self.assert_json_success(result)
def test_get_chart_data_for_realm(self) -> None:
user = self.example_user('hamlet')
self.login_user(user)
user_profile = self.example_user('hamlet')
self.login(user_profile.email)
result = self.client_get('/json/analytics/chart_data/realm/zulip',
result = self.client_get('/json/analytics/chart_data/realm/zulip/',
{'chart_name': 'number_of_humans'})
self.assert_json_error(result, "Must be an server administrator", 400)
user = self.example_user('hamlet')
user.is_staff = True
user.save(update_fields=['is_staff'])
user_profile = self.example_user('hamlet')
user_profile.is_staff = True
user_profile.save(update_fields=['is_staff'])
stat = COUNT_STATS['realm_active_humans::day']
self.insert_data(stat, [None], [])
@@ -393,16 +310,16 @@ class TestGetChartData(ZulipTestCase):
self.assert_json_success(result)
def test_get_chart_data_for_installation(self) -> None:
user = self.example_user('hamlet')
self.login_user(user)
user_profile = self.example_user('hamlet')
self.login(user_profile.email)
result = self.client_get('/json/analytics/chart_data/installation',
{'chart_name': 'number_of_humans'})
self.assert_json_error(result, "Must be an server administrator", 400)
user = self.example_user('hamlet')
user.is_staff = True
user.save(update_fields=['is_staff'])
user_profile = self.example_user('hamlet')
user_profile.is_staff = True
user_profile.save(update_fields=['is_staff'])
stat = COUNT_STATS['realm_active_humans::day']
self.insert_data(stat, [None], [])
@@ -410,250 +327,13 @@ class TestGetChartData(ZulipTestCase):
{'chart_name': 'number_of_humans'})
self.assert_json_success(result)
class TestSupportEndpoint(ZulipTestCase):
def test_search(self) -> None:
reset_emails_in_zulip_realm()
def check_hamlet_user_query_result(result: HttpResponse) -> None:
self.assert_in_success_response(['<span class="label">user</span>\n', '<h3>King Hamlet</h3>',
'<b>Email</b>: hamlet@zulip.com', '<b>Is active</b>: True<br>',
'<b>Admins</b>: desdemona@zulip.com, iago@zulip.com\n',
'class="copy-button" data-copytext="desdemona@zulip.com, iago@zulip.com"',
], result)
def check_zulip_realm_query_result(result: HttpResponse) -> None:
zulip_realm = get_realm("zulip")
self.assert_in_success_response([f'<input type="hidden" name="realm_id" value="{zulip_realm.id}"',
'Zulip Dev</h3>',
'<option value="1" selected>Self Hosted</option>',
'<option value="2" >Limited</option>',
'input type="number" name="discount" value="None"',
'<option value="active" selected>Active</option>',
'<option value="deactivated" >Deactivated</option>',
'scrub-realm-button">',
'data-string-id="zulip"'], result)
def check_lear_realm_query_result(result: HttpResponse) -> None:
lear_realm = get_realm("lear")
self.assert_in_success_response([f'<input type="hidden" name="realm_id" value="{lear_realm.id}"',
'Lear &amp; Co.</h3>',
'<option value="1" selected>Self Hosted</option>',
'<option value="2" >Limited</option>',
'input type="number" name="discount" value="None"',
'<option value="active" selected>Active</option>',
'<option value="deactivated" >Deactivated</option>',
'scrub-realm-button">',
'data-string-id="lear"'], result)
def check_preregistration_user_query_result(result: HttpResponse, email: str, invite: bool=False) -> None:
self.assert_in_success_response(['<span class="label">preregistration user</span>\n',
f'<b>Email</b>: {email}',
], result)
if invite:
self.assert_in_success_response(['<span class="label">invite</span>'], result)
self.assert_in_success_response(['<b>Expires in</b>: 1\xa0week, 3',
'<b>Status</b>: Link has never been clicked'], result)
self.assert_in_success_response([], result)
else:
self.assert_not_in_success_response(['<span class="label">invite</span>'], result)
self.assert_in_success_response(['<b>Expires in</b>: 1\xa0day',
'<b>Status</b>: Link has never been clicked'], result)
def check_realm_creation_query_result(result: HttpResponse, email: str) -> None:
self.assert_in_success_response(['<span class="label">preregistration user</span>\n',
'<span class="label">realm creation</span>\n',
'<b>Link</b>: http://testserver/accounts/do_confirm/',
'<b>Expires in</b>: 1\xa0day<br>\n',
], result)
def check_multiuse_invite_link_query_result(result: HttpResponse) -> None:
self.assert_in_success_response(['<span class="label">multiuse invite</span>\n',
'<b>Link</b>: http://zulip.testserver/join/',
'<b>Expires in</b>: 1\xa0week, 3',
], result)
def check_realm_reactivation_link_query_result(result: HttpResponse) -> None:
self.assert_in_success_response(['<span class="label">realm reactivation</span>\n',
'<b>Link</b>: http://zulip.testserver/reactivate/',
'<b>Expires in</b>: 1\xa0day',
], result)
self.login('cordelia')
result = self.client_get("/activity/support")
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
self.login('iago')
result = self.client_get("/activity/support")
self.assert_in_success_response(['<input type="text" name="q" class="input-xxlarge search-query"'], result)
result = self.client_get("/activity/support", {"q": "hamlet@zulip.com"})
check_hamlet_user_query_result(result)
check_zulip_realm_query_result(result)
result = self.client_get("/activity/support", {"q": "lear"})
check_lear_realm_query_result(result)
result = self.client_get("/activity/support", {"q": "http://lear.testserver"})
check_lear_realm_query_result(result)
with self.settings(REALM_HOSTS={'zulip': 'localhost'}):
result = self.client_get("/activity/support", {"q": "http://localhost"})
check_zulip_realm_query_result(result)
result = self.client_get("/activity/support", {"q": "hamlet@zulip.com, lear"})
check_hamlet_user_query_result(result)
check_zulip_realm_query_result(result)
check_lear_realm_query_result(result)
result = self.client_get("/activity/support", {"q": "lear, Hamlet <hamlet@zulip.com>"})
check_hamlet_user_query_result(result)
check_zulip_realm_query_result(result)
check_lear_realm_query_result(result)
self.client_post('/accounts/home/', {'email': self.nonreg_email("test")})
self.login('iago')
result = self.client_get("/activity/support", {"q": self.nonreg_email("test")})
check_preregistration_user_query_result(result, self.nonreg_email("test"))
check_zulip_realm_query_result(result)
stream_ids = [self.get_stream_id("Denmark")]
invitee_emails = [self.nonreg_email("test1")]
self.client_post("/json/invites", {"invitee_emails": invitee_emails,
"stream_ids": ujson.dumps(stream_ids),
"invite_as": PreregistrationUser.INVITE_AS['MEMBER']})
result = self.client_get("/activity/support", {"q": self.nonreg_email("test1")})
check_preregistration_user_query_result(result, self.nonreg_email("test1"), invite=True)
check_zulip_realm_query_result(result)
email = self.nonreg_email('alice')
self.client_post('/new/', {'email': email})
result = self.client_get("/activity/support", {"q": email})
check_realm_creation_query_result(result, email)
do_create_multiuse_invite_link(self.example_user("hamlet"), invited_as=1)
result = self.client_get("/activity/support", {"q": "zulip"})
check_multiuse_invite_link_query_result(result)
check_zulip_realm_query_result(result)
MultiuseInvite.objects.all().delete()
do_send_realm_reactivation_email(get_realm("zulip"))
result = self.client_get("/activity/support", {"q": "zulip"})
check_realm_reactivation_link_query_result(result)
check_zulip_realm_query_result(result)
def test_change_plan_type(self) -> None:
cordelia = self.example_user('cordelia')
self.login_user(cordelia)
result = self.client_post("/activity/support", {"realm_id": f"{cordelia.realm_id}", "plan_type": "2"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
iago = self.example_user("iago")
self.login_user(iago)
with mock.patch("analytics.views.do_change_plan_type") as m:
result = self.client_post("/activity/support", {"realm_id": f"{iago.realm_id}", "plan_type": "2"})
m.assert_called_once_with(get_realm("zulip"), 2)
self.assert_in_success_response(["Plan type of Zulip Dev changed from self hosted to limited"], result)
def test_attach_discount(self) -> None:
cordelia = self.example_user('cordelia')
lear_realm = get_realm('lear')
self.login_user(cordelia)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "discount": "25"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
self.login('iago')
with mock.patch("analytics.views.attach_discount_to_realm") as m:
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "discount": "25"})
m.assert_called_once_with(get_realm("lear"), 25)
self.assert_in_success_response(["Discount of Lear &amp; Co. changed to 25 from None"], result)
def test_change_sponsorship_status(self) -> None:
lear_realm = get_realm("lear")
self.assertIsNone(get_customer_by_realm(lear_realm))
cordelia = self.example_user('cordelia')
self.login_user(cordelia)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}",
"sponsorship_pending": "true"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
iago = self.example_user("iago")
self.login_user(iago)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}",
"sponsorship_pending": "true"})
self.assert_in_success_response(["Lear &amp; Co. marked as pending sponsorship."], result)
customer = get_customer_by_realm(lear_realm)
assert(customer is not None)
self.assertTrue(customer.sponsorship_pending)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}",
"sponsorship_pending": "false"})
self.assert_in_success_response(["Lear &amp; Co. is no longer pending sponsorship."], result)
customer = get_customer_by_realm(lear_realm)
assert(customer is not None)
self.assertFalse(customer.sponsorship_pending)
def test_activate_or_deactivate_realm(self) -> None:
cordelia = self.example_user('cordelia')
lear_realm = get_realm('lear')
self.login_user(cordelia)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "status": "deactivated"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
self.login('iago')
with mock.patch("analytics.views.do_deactivate_realm") as m:
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "status": "deactivated"})
m.assert_called_once_with(lear_realm, self.example_user("iago"))
self.assert_in_success_response(["Lear &amp; Co. deactivated"], result)
with mock.patch("analytics.views.do_send_realm_reactivation_email") as m:
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "status": "active"})
m.assert_called_once_with(lear_realm)
self.assert_in_success_response(["Realm reactivation email sent to admins of Lear"], result)
def test_scrub_realm(self) -> None:
cordelia = self.example_user('cordelia')
lear_realm = get_realm('lear')
self.login_user(cordelia)
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "discount": "25"})
self.assertEqual(result.status_code, 302)
self.assertEqual(result["Location"], "/login/")
self.login('iago')
with mock.patch("analytics.views.do_scrub_realm") as m:
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}", "scrub_realm": "scrub_realm"})
m.assert_called_once_with(lear_realm, acting_user=self.example_user("iago"))
self.assert_in_success_response(["Lear &amp; Co. scrubbed"], result)
with mock.patch("analytics.views.do_scrub_realm") as m:
with self.assertRaises(AssertionError):
result = self.client_post("/activity/support", {"realm_id": f"{lear_realm.id}"})
m.assert_not_called()
class TestGetChartDataHelpers(ZulipTestCase):
# last_successful_fill is in analytics/models.py, but get_chart_data is
# the only function that uses it at the moment
def test_last_successful_fill(self) -> None:
self.assertIsNone(last_successful_fill('non-existant'))
a_time = datetime(2016, 3, 14, 19, tzinfo=timezone.utc)
one_hour_before = datetime(2016, 3, 14, 18, tzinfo=timezone.utc)
a_time = datetime(2016, 3, 14, 19).replace(tzinfo=utc)
one_hour_before = datetime(2016, 3, 14, 18).replace(tzinfo=utc)
fillstate = FillState.objects.create(property='property', end_time=a_time,
state=FillState.DONE)
self.assertEqual(last_successful_fill('property'), a_time)
@@ -662,7 +342,7 @@ class TestGetChartDataHelpers(ZulipTestCase):
self.assertEqual(last_successful_fill('property'), one_hour_before)
def test_sort_by_totals(self) -> None:
empty: List[int] = []
empty = [] # type: List[int]
value_arrays = {'c': [0, 1], 'a': [9], 'b': [1, 1, 1], 'd': empty}
self.assertEqual(sort_by_totals(value_arrays), ['a', 'b', 'c', 'd'])
@@ -676,9 +356,9 @@ class TestTimeRange(ZulipTestCase):
HOUR = timedelta(hours=1)
DAY = timedelta(days=1)
a_time = datetime(2016, 3, 14, 22, 59, tzinfo=timezone.utc)
floor_hour = datetime(2016, 3, 14, 22, tzinfo=timezone.utc)
floor_day = datetime(2016, 3, 14, tzinfo=timezone.utc)
a_time = datetime(2016, 3, 14, 22, 59).replace(tzinfo=utc)
floor_hour = datetime(2016, 3, 14, 22).replace(tzinfo=utc)
floor_day = datetime(2016, 3, 14).replace(tzinfo=utc)
# test start == end
self.assertEqual(time_range(a_time, a_time, CountStat.HOUR, None), [])


@@ -1,34 +1,25 @@
from django.conf.urls import include
from django.urls import path
from django.conf.urls import include, url
import analytics.views
from zerver.lib.rest import rest_dispatch
i18n_urlpatterns = [
# Server admin (user_profile.is_staff) visible stats pages
path('activity', analytics.views.get_activity,
name='analytics.views.get_activity'),
path('activity/support', analytics.views.support,
name='analytics.views.support'),
path('realm_activity/<str:realm_str>/', analytics.views.get_realm_activity,
name='analytics.views.get_realm_activity'),
path('user_activity/<str:email>/', analytics.views.get_user_activity,
name='analytics.views.get_user_activity'),
url(r'^activity$', analytics.views.get_activity,
name='analytics.views.get_activity'),
url(r'^realm_activity/(?P<realm_str>[\S]+)/$', analytics.views.get_realm_activity,
name='analytics.views.get_realm_activity'),
url(r'^user_activity/(?P<email>[\S]+)/$', analytics.views.get_user_activity,
name='analytics.views.get_user_activity'),
path('stats/realm/<str:realm_str>/', analytics.views.stats_for_realm,
name='analytics.views.stats_for_realm'),
path('stats/installation', analytics.views.stats_for_installation,
name='analytics.views.stats_for_installation'),
path('stats/remote/<int:remote_server_id>/installation',
analytics.views.stats_for_remote_installation,
name='analytics.views.stats_for_remote_installation'),
path('stats/remote/<int:remote_server_id>/realm/<int:remote_realm_id>/',
analytics.views.stats_for_remote_realm,
name='analytics.views.stats_for_remote_realm'),
url(r'^stats/realm/(?P<realm_str>[\S]+)/$', analytics.views.stats_for_realm,
name='analytics.views.stats_for_realm'),
url(r'^stats/installation$', analytics.views.stats_for_installation,
name='analytics.views.stats_for_installation'),
# User-visible stats page
path('stats', analytics.views.stats,
name='analytics.views.stats'),
url(r'^stats$', analytics.views.stats,
name='analytics.views.stats'),
]
# These endpoints are a part of the API (V1), which uses:
@@ -41,22 +32,17 @@ i18n_urlpatterns = [
# All of these paths are accessed by either a /json or /api prefix
v1_api_and_json_patterns = [
# get data for the graphs at /stats
path('analytics/chart_data', rest_dispatch,
{'GET': 'analytics.views.get_chart_data'}),
path('analytics/chart_data/realm/<str:realm_str>', rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_realm'}),
path('analytics/chart_data/installation', rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_installation'}),
path('analytics/chart_data/remote/<int:remote_server_id>/installation', rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_remote_installation'}),
path('analytics/chart_data/remote/<int:remote_server_id>/realm/<int:remote_realm_id>',
rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_remote_realm'}),
url(r'^analytics/chart_data$', rest_dispatch,
{'GET': 'analytics.views.get_chart_data'}),
url(r'^analytics/chart_data/realm/(?P<realm_str>[\S]+)$', rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_realm'}),
url(r'^analytics/chart_data/installation$', rest_dispatch,
{'GET': 'analytics.views.get_chart_data_for_installation'}),
]
i18n_urlpatterns += [
path('api/v1/', include(v1_api_and_json_patterns)),
path('json/', include(v1_api_and_json_patterns)),
url(r'^api/v1/', include(v1_api_and_json_patterns)),
url(r'^json/', include(v1_api_and_json_patterns)),
]
urlpatterns = i18n_urlpatterns

File diff suppressed because it is too large


@@ -1,18 +0,0 @@
module.exports = {
presets: [
[
"@babel/preset-env",
{
corejs: 3,
loose: true, // Loose mode for…of loops are 5× faster in Firefox
useBuiltIns: "usage",
},
],
"@babel/typescript",
],
plugins: [
"@babel/proposal-class-properties",
["@babel/plugin-proposal-unicode-property-regex", { useUnicodeFlag: false }],
],
sourceType: "unambiguous",
};


@@ -1,3 +1,5 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2008, Jarek Zgoda <jarek.zgoda@gmail.com>
# Permission is hereby granted, free of charge, to any person obtaining a


@@ -1,5 +1,6 @@
# -*- coding: utf-8 -*-
from django.db import models, migrations
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):


@@ -1,5 +1,6 @@
# -*- coding: utf-8 -*-
from django.db import models, migrations
import django.utils.timezone
from django.db import migrations, models
class Migration(migrations.Migration):


@@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.10.4 on 2017-01-17 09:16
from django.db import migrations


@@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.2 on 2017-07-08 04:23
from django.db import migrations, models


@@ -1,6 +1,9 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2017-11-30 00:13
import django.db.models.deletion
from __future__ import unicode_literals
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):


@@ -1,4 +1,6 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2018-01-29 18:39
from __future__ import unicode_literals
from django.db import migrations, models


@@ -1,37 +0,0 @@
# Generated by Django 2.2.10 on 2020-03-27 09:02
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('confirmation', '0006_realmcreationkey_presume_email_valid'),
]
operations = [
migrations.AlterField(
model_name='confirmation',
name='confirmation_key',
field=models.CharField(db_index=True, max_length=40),
),
migrations.AlterField(
model_name='confirmation',
name='date_sent',
field=models.DateTimeField(db_index=True),
),
migrations.AlterField(
model_name='confirmation',
name='object_id',
field=models.PositiveIntegerField(db_index=True),
),
migrations.AlterField(
model_name='realmcreationkey',
name='creation_key',
field=models.CharField(db_index=True, max_length=40, verbose_name='activation key'),
),
migrations.AlterUniqueTogether(
name='confirmation',
unique_together={('type', 'confirmation_key')},
),
]


@@ -1,24 +1,27 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2008, Jarek Zgoda <jarek.zgoda@gmail.com>
__revision__ = '$Id: models.py 28 2009-10-22 15:03:02Z jarek.zgoda $'
import datetime
import string
from random import SystemRandom
from typing import Mapping, Optional, Union
from urllib.parse import urljoin
from django.conf import settings
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType
import datetime
from django.db import models
from django.db.models import CASCADE
from django.urls import reverse
from django.conf import settings
from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes.fields import GenericForeignKey
from django.http import HttpRequest, HttpResponse
from django.shortcuts import render
from django.urls import reverse
from django.utils.timezone import now as timezone_now
from zerver.models import EmailChangeStatus, MultiuseInvite, PreregistrationUser, Realm, UserProfile
from zerver.lib.utils import generate_random_token
from zerver.models import PreregistrationUser, EmailChangeStatus, MultiuseInvite, \
UserProfile, Realm
from random import SystemRandom
import string
from typing import Any, Dict, Optional, Union
class ConfirmationKeyException(Exception):
WRONG_LENGTH = 1
@@ -43,8 +46,7 @@ def generate_key() -> str:
ConfirmationObjT = Union[MultiuseInvite, PreregistrationUser, EmailChangeStatus]
def get_object_from_key(confirmation_key: str,
confirmation_type: int,
activate_object: bool=True) -> ConfirmationObjT:
confirmation_type: int) -> ConfirmationObjT:
# Confirmation keys used to be 40 characters
if len(confirmation_key) not in (24, 40):
raise ConfirmationKeyException(ConfirmationKeyException.WRONG_LENGTH)
@@ -59,42 +61,35 @@ def get_object_from_key(confirmation_key: str,
raise ConfirmationKeyException(ConfirmationKeyException.EXPIRED)
obj = confirmation.content_object
if activate_object and hasattr(obj, "status"):
if hasattr(obj, "status"):
obj.status = getattr(settings, 'STATUS_ACTIVE', 1)
obj.save(update_fields=['status'])
return obj
def create_confirmation_link(obj: ContentType,
def create_confirmation_link(obj: ContentType, host: str,
confirmation_type: int,
url_args: Mapping[str, str] = {}) -> str:
url_args: Optional[Dict[str, str]]=None) -> str:
key = generate_key()
realm = None
if hasattr(obj, 'realm'):
realm = obj.realm
elif isinstance(obj, Realm):
realm = obj
Confirmation.objects.create(content_object=obj, date_sent=timezone_now(), confirmation_key=key,
realm=realm, type=confirmation_type)
return confirmation_url(key, realm, confirmation_type, url_args)
realm=obj.realm, type=confirmation_type)
return confirmation_url(key, host, confirmation_type, url_args)
def confirmation_url(confirmation_key: str, realm: Optional[Realm],
def confirmation_url(confirmation_key: str, host: str,
confirmation_type: int,
url_args: Mapping[str, str] = {}) -> str:
url_args = dict(url_args)
url_args: Optional[Dict[str, str]]=None) -> str:
if url_args is None:
url_args = {}
url_args['confirmation_key'] = confirmation_key
return urljoin(
settings.ROOT_DOMAIN_URI if realm is None else realm.uri,
reverse(_properties[confirmation_type].url_name, kwargs=url_args),
)
return '%s%s%s' % (settings.EXTERNAL_URI_SCHEME, host,
reverse(_properties[confirmation_type].url_name, kwargs=url_args))
class Confirmation(models.Model):
content_type = models.ForeignKey(ContentType, on_delete=CASCADE)
object_id: int = models.PositiveIntegerField(db_index=True)
object_id = models.PositiveIntegerField() # type: int
content_object = GenericForeignKey('content_type', 'object_id')
date_sent: datetime.datetime = models.DateTimeField(db_index=True)
confirmation_key: str = models.CharField(max_length=40, db_index=True)
realm: Optional[Realm] = models.ForeignKey(Realm, null=True, on_delete=CASCADE)
date_sent = models.DateTimeField() # type: datetime.datetime
confirmation_key = models.CharField(max_length=40) # type: str
realm = models.ForeignKey(Realm, null=True, on_delete=CASCADE) # type: Optional[Realm]
# The following list is the set of valid types
USER_REGISTRATION = 1
@@ -104,14 +99,10 @@ class Confirmation(models.Model):
SERVER_REGISTRATION = 5
MULTIUSE_INVITE = 6
REALM_CREATION = 7
REALM_REACTIVATION = 8
type: int = models.PositiveSmallIntegerField()
type = models.PositiveSmallIntegerField() # type: int
def __str__(self) -> str:
return f'<Confirmation: {self.content_object}>'
class Meta:
unique_together = ("type", "confirmation_key")
return '<Confirmation: %s>' % (self.content_object,)
class ConfirmationType:
def __init__(self, url_name: str,
@@ -130,18 +121,8 @@ _properties = {
'zerver.views.registration.accounts_home_from_multiuse_invite',
validity_in_days=settings.INVITATION_LINK_VALIDITY_DAYS),
Confirmation.REALM_CREATION: ConfirmationType('check_prereg_key_and_redirect'),
Confirmation.REALM_REACTIVATION: ConfirmationType('zerver.views.realm.realm_reactivation'),
}
def one_click_unsubscribe_link(user_profile: UserProfile, email_type: str) -> str:
"""
Generate a unique link that a logged-out user can visit to unsubscribe from
Zulip e-mails without having to first log in.
"""
return create_confirmation_link(user_profile,
Confirmation.UNSUBSCRIBE,
url_args = {'email_type': email_type})
# Functions related to links generated by the generate_realm_creation_link.py
# management command.
# Note that being validated here will just allow the user to access the create_realm
@@ -168,18 +149,18 @@ def generate_realm_creation_url(by_admin: bool=False) -> str:
RealmCreationKey.objects.create(creation_key=key,
date_created=timezone_now(),
presume_email_valid=by_admin)
return urljoin(
settings.ROOT_DOMAIN_URI,
reverse('zerver.views.create_realm', kwargs={'creation_key': key}),
)
return '%s%s%s' % (settings.EXTERNAL_URI_SCHEME,
settings.EXTERNAL_HOST,
reverse('zerver.views.create_realm',
kwargs={'creation_key': key}))
class RealmCreationKey(models.Model):
creation_key = models.CharField('activation key', db_index=True, max_length=40)
creation_key = models.CharField('activation key', max_length=40)
date_created = models.DateTimeField('created', default=timezone_now)
# True just if we should presume the email address the user enters
# is theirs, and skip sending mail to it to confirm that.
presume_email_valid: bool = models.BooleanField(default=False)
presume_email_valid = models.BooleanField(default=False) # type: bool
class Invalid(Exception):
pass


@@ -1,6 +1,9 @@
# -*- coding: utf-8 -*-
# Copyright: (c) 2008, Jarek Zgoda <jarek.zgoda@gmail.com>
from typing import Any, Dict
__revision__ = '$Id: settings.py 12 2008-11-23 19:38:52Z jarek.zgoda $'
STATUS_ACTIVE = 1
STATUS_REVOKED = 2


@@ -1,32 +1,25 @@
import logging
import math
import os
from datetime import datetime, timedelta
from decimal import Decimal
import datetime
from functools import wraps
from typing import Callable, Dict, Optional, Tuple, TypeVar, cast
import stripe
import logging
import os
from typing import Any, Callable, Dict, Optional, TypeVar, Tuple
import ujson
from django.conf import settings
from django.core.signing import Signer
from django.db import transaction
from django.utils.timezone import now as timezone_now
from django.utils.translation import ugettext as _
from corporate.models import (
Customer,
CustomerPlan,
LicenseLedger,
get_current_plan_by_customer,
get_current_plan_by_realm,
get_customer_by_realm,
)
from django.conf import settings
from django.db import transaction
from django.utils.translation import ugettext as _
from django.utils.timezone import now as timezone_now
from django.core.signing import Signer
import stripe
from zerver.lib.exceptions import JsonableError
from zerver.lib.logging_util import log_to_file
from zerver.lib.timestamp import datetime_to_timestamp, timestamp_to_datetime
from zerver.lib.utils import generate_random_token
from zerver.models import Realm, RealmAuditLog, UserProfile
from zproject.config import get_secret
from zerver.lib.actions import do_change_plan_type
from zerver.models import Realm, UserProfile, RealmAuditLog
from corporate.models import Customer, Plan, Coupon, BillingProcessor
from zproject.settings import get_secret
STRIPE_PUBLISHABLE_KEY = get_secret('stripe_publishable_key')
stripe.api_key = get_secret('stripe_secret_key')
@@ -39,18 +32,30 @@ billing_logger = logging.getLogger('corporate.stripe')
log_to_file(billing_logger, BILLING_LOG_PATH)
log_to_file(logging.getLogger('stripe'), BILLING_LOG_PATH)
CallableT = TypeVar('CallableT', bound=Callable[..., object])
## Note: this is no longer accurate, as of when we added coupons
# To generate the fixture data in stripe_fixtures.json:
# * Set PRINT_STRIPE_FIXTURE_DATA to True
# * ./manage.py setup_stripe
# * Customer.objects.all().delete()
# * Log in as a user, and go to http://localhost:9991/upgrade/
# * Click Add card. Enter the following billing details:
# Name: Ada Starr, Street: Under the sea, City: Pacific,
# Zip: 33333, Country: United States
# Card number: 4242424242424242, Expiry: 03/33, CVV: 333
# * Click Make payment.
# * Copy out the 4 blobs of json from the dev console into stripe_fixtures.json.
# The contents of that file are '{\n' + concatenate the 4 json blobs + '\n}'.
# Then you can run e.g. `M-x mark-whole-buffer` and `M-x indent-region` in emacs
# to prettify the file (and make 4 space indents).
# * Copy out the customer id, plan id, and quantity values into
# corporate.tests.test_stripe.StripeTest.setUp.
# * Set PRINT_STRIPE_FIXTURE_DATA to False
PRINT_STRIPE_FIXTURE_DATA = False
MIN_INVOICED_LICENSES = 30
MAX_INVOICED_LICENSES = 1000
DEFAULT_INVOICE_DAYS_UNTIL_DUE = 30
CallableT = TypeVar('CallableT', bound=Callable[..., Any])
def get_latest_seat_count(realm: Realm) -> int:
non_guests = UserProfile.objects.filter(
realm=realm, is_active=True, is_bot=False).exclude(role=UserProfile.ROLE_GUEST).count()
guests = UserProfile.objects.filter(
realm=realm, is_active=True, is_bot=False, role=UserProfile.ROLE_GUEST).count()
return max(non_guests, math.ceil(guests / 5))
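The guest-weighted seat count above is easiest to see with concrete numbers (illustrative, not from the diff):
import math
non_guests, guests = 10, 26
max(non_guests, math.ceil(guests / 5))  # -> 10: guests are billed at a 1:5 ratio, so 26 guests count as 6 seats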
def get_seat_count(realm: Realm) -> int:
return UserProfile.objects.filter(realm=realm, is_active=True, is_bot=False).count()
def sign_string(string: str) -> Tuple[str, str]:
salt = generate_random_token(64)
@@ -61,94 +66,13 @@ def unsign_string(signed_string: str, salt: str) -> str:
signer = Signer(salt=salt)
return signer.unsign(signed_string)
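A minimal usage sketch for the signing helpers, assuming (as the truncated hunk suggests) that sign_string returns a (signed_value, salt) pair and that Django settings are configured so Signer can use SECRET_KEY:
signed, salt = sign_string('seat_count=25')            # illustrative payload
assert unsign_string(signed, salt) == 'seat_count=25'  # round-trips via Django's Signer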
# Be extremely careful changing this function. Historical billing periods
# are not stored anywhere, and are just computed on the fly using this
# function. Any change you make here should return the same value (or be
# within a few seconds) for basically any value from when the billing system
# went online to within a year from now.
def add_months(dt: datetime, months: int) -> datetime:
assert(months >= 0)
# It's fine that the max day in Feb is 28 for leap years.
MAX_DAY_FOR_MONTH = {1: 31, 2: 28, 3: 31, 4: 30, 5: 31, 6: 30,
7: 31, 8: 31, 9: 30, 10: 31, 11: 30, 12: 31}
year = dt.year
month = dt.month + months
while month > 12:
year += 1
month -= 12
day = min(dt.day, MAX_DAY_FOR_MONTH[month])
# datetimes don't support leap seconds, so don't need to worry about those
return dt.replace(year=year, month=month, day=day)
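Since historical billing periods are recomputed on the fly from this function (per the comment above), the month-end clamping is worth a concrete example (dates illustrative):
from datetime import datetime
add_months(datetime(2020, 1, 31), 1)   # -> datetime(2020, 2, 28): day clamped to February's maximum
add_months(datetime(2020, 1, 31), 13)  # -> datetime(2021, 2, 28): year rolls over once the month exceeds 12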
def next_month(billing_cycle_anchor: datetime, dt: datetime) -> datetime:
estimated_months = round((dt - billing_cycle_anchor).days * 12. / 365)
for months in range(max(estimated_months - 1, 0), estimated_months + 2):
proposed_next_month = add_months(billing_cycle_anchor, months)
if 20 < (proposed_next_month - dt).days < 40:
return proposed_next_month
raise AssertionError('Something wrong in next_month calculation with '
f'billing_cycle_anchor: {billing_cycle_anchor}, dt: {dt}')
def start_of_next_billing_cycle(plan: CustomerPlan, event_time: datetime) -> datetime:
if plan.status == CustomerPlan.FREE_TRIAL:
assert(plan.next_invoice_date is not None) # for mypy
return plan.next_invoice_date
months_per_period = {
CustomerPlan.ANNUAL: 12,
CustomerPlan.MONTHLY: 1,
}[plan.billing_schedule]
periods = 1
dt = plan.billing_cycle_anchor
while dt <= event_time:
dt = add_months(plan.billing_cycle_anchor, months_per_period * periods)
periods += 1
return dt
def next_invoice_date(plan: CustomerPlan) -> Optional[datetime]:
if plan.status == CustomerPlan.ENDED:
return None
assert(plan.next_invoice_date is not None) # for mypy
months_per_period = {
CustomerPlan.ANNUAL: 12,
CustomerPlan.MONTHLY: 1,
}[plan.billing_schedule]
if plan.automanage_licenses:
months_per_period = 1
periods = 1
dt = plan.billing_cycle_anchor
while dt <= plan.next_invoice_date:
dt = add_months(plan.billing_cycle_anchor, months_per_period * periods)
periods += 1
return dt
def renewal_amount(plan: CustomerPlan, event_time: datetime) -> int: # nocoverage: TODO
if plan.fixed_price is not None:
return plan.fixed_price
new_plan, last_ledger_entry = make_end_of_cycle_updates_if_needed(plan, event_time)
if last_ledger_entry is None:
return 0
if last_ledger_entry.licenses_at_next_renewal is None:
return 0
if new_plan is not None:
plan = new_plan
assert(plan.price_per_license is not None) # for mypy
return plan.price_per_license * last_ledger_entry.licenses_at_next_renewal
def get_idempotency_key(ledger_entry: LicenseLedger) -> Optional[str]:
if settings.TEST_SUITE:
return None
return f'ledger_entry:{ledger_entry.id}' # nocoverage
class BillingError(Exception):
# error messages
CONTACT_SUPPORT = _("Something went wrong. Please contact {email}.").format(
email=settings.ZULIP_ADMINISTRATOR,
)
CONTACT_SUPPORT = _("Something went wrong. Please contact %s." % (settings.ZULIP_ADMINISTRATOR,))
TRY_RELOADING = _("Something went wrong. Please reload the page.")
# description is used only for tests
def __init__(self, description: str, message: str=CONTACT_SUPPORT) -> None:
def __init__(self, description: str, message: str) -> None:
self.description = description
self.message = message
@@ -160,11 +84,14 @@ class StripeConnectionError(BillingError):
def catch_stripe_errors(func: CallableT) -> CallableT:
@wraps(func)
def wrapped(*args: object, **kwargs: object) -> object:
def wrapped(*args: Any, **kwargs: Any) -> Any:
if settings.DEVELOPMENT and not settings.TEST_SUITE: # nocoverage
if STRIPE_PUBLISHABLE_KEY is None:
raise BillingError('missing stripe config', "Missing Stripe config. "
"See https://zulip.readthedocs.io/en/latest/subsystems/billing.html.")
if not Plan.objects.exists():
raise BillingError('missing plans',
"Plan objects not created. Please run ./manage.py setup_stripe")
try:
return func(*args, **kwargs)
# See https://stripe.com/docs/api/python#error_handling, though
@@ -172,36 +99,85 @@ def catch_stripe_errors(func: CallableT) -> CallableT:
# https://stripe.com/docs/error-codes gives a more detailed set of error codes
except stripe.error.StripeError as e:
err = e.json_body.get('error', {})
billing_logger.error(
"Stripe error: %s %s %s %s",
e.http_status, err.get('type'), err.get('code'), err.get('param'),
)
billing_logger.error("Stripe error: %s %s %s %s" % (
e.http_status, err.get('type'), err.get('code'), err.get('param')))
if isinstance(e, stripe.error.CardError):
# TODO: Look into i18n for this
raise StripeCardError('card error', err.get('message'))
if isinstance(e, (stripe.error.RateLimitError, stripe.error.APIConnectionError)): # nocoverage TODO
if isinstance(e, stripe.error.RateLimitError) or \
isinstance(e, stripe.error.APIConnectionError): # nocoverage TODO
raise StripeConnectionError(
'stripe connection error',
_("Something went wrong. Please wait a few seconds and try again."))
raise BillingError('other stripe error', BillingError.CONTACT_SUPPORT)
return cast(CallableT, wrapped)
return wrapped # type: ignore # https://github.com/python/mypy/issues/1927
@catch_stripe_errors
def stripe_get_customer(stripe_customer_id: str) -> stripe.Customer:
return stripe.Customer.retrieve(stripe_customer_id, expand=["default_source"])
stripe_customer = stripe.Customer.retrieve(stripe_customer_id, expand=["default_source"])
if PRINT_STRIPE_FIXTURE_DATA:
print(''.join(['"customer_with_subscription": ', str(stripe_customer), ','])) # nocoverage
return stripe_customer
@catch_stripe_errors
def do_create_stripe_customer(user: UserProfile, stripe_token: Optional[str]=None) -> Customer:
def stripe_get_upcoming_invoice(stripe_customer_id: str) -> stripe.Invoice:
stripe_invoice = stripe.Invoice.upcoming(customer=stripe_customer_id)
if PRINT_STRIPE_FIXTURE_DATA:
print(''.join(['"upcoming_invoice": ', str(stripe_invoice), ','])) # nocoverage
return stripe_invoice
@catch_stripe_errors
def stripe_get_invoice_preview_for_downgrade(
stripe_customer_id: str, stripe_subscription_id: str,
stripe_subscriptionitem_id: str) -> stripe.Invoice:
return stripe.Invoice.upcoming(
customer=stripe_customer_id, subscription=stripe_subscription_id,
subscription_items=[{'id': stripe_subscriptionitem_id, 'quantity': 0}])
def preview_invoice_total_for_downgrade(stripe_customer: stripe.Customer) -> int:
stripe_subscription = extract_current_subscription(stripe_customer)
if stripe_subscription is None:
# Most likely situation is: user A goes to billing page, user B
# cancels subscription, user A clicks on "downgrade" or something
# else that calls this function.
billing_logger.error("Trying to extract subscription item that doesn't exist, for Stripe customer %s"
% (stripe_customer.id,))
raise BillingError('downgrade without subscription', BillingError.TRY_RELOADING)
for item in stripe_subscription['items']:
# There should only be one item, but we can't index into stripe_subscription['items']
stripe_subscriptionitem_id = item.id
return stripe_get_invoice_preview_for_downgrade(
stripe_customer.id, stripe_subscription.id, stripe_subscriptionitem_id).total
# Return type should be Optional[stripe.Subscription], which throws a mypy error.
# Will fix once we add type stubs for the Stripe API.
def extract_current_subscription(stripe_customer: stripe.Customer) -> Any:
if not stripe_customer.subscriptions:
return None
for stripe_subscription in stripe_customer.subscriptions:
if stripe_subscription.status != "canceled":
return stripe_subscription
return None
@catch_stripe_errors
def do_create_customer(user: UserProfile, stripe_token: Optional[str]=None,
coupon: Optional[Coupon]=None) -> stripe.Customer:
realm = user.realm
stripe_coupon_id = None
if coupon is not None:
stripe_coupon_id = coupon.stripe_coupon_id
# We could do a better job of handling race conditions here, but if two
# people from a realm try to upgrade at exactly the same time, the main
# bad thing that will happen is that we will create an extra stripe
# customer that we can delete or ignore.
stripe_customer = stripe.Customer.create(
description=f"{realm.string_id} ({realm.name})",
email=user.delivery_email,
description="%s (%s)" % (realm.string_id, realm.name),
email=user.email,
metadata={'realm_id': realm.id, 'realm_str': realm.string_id},
source=stripe_token)
source=stripe_token,
coupon=stripe_coupon_id)
if PRINT_STRIPE_FIXTURE_DATA:
print(''.join(['"create_customer": ', str(stripe_customer), ','])) # nocoverage
event_time = timestamp_to_datetime(stripe_customer.created)
with transaction.atomic():
RealmAuditLog.objects.create(
@@ -211,400 +187,231 @@ def do_create_stripe_customer(user: UserProfile, stripe_token: Optional[str]=Non
RealmAuditLog.objects.create(
realm=user.realm, acting_user=user, event_type=RealmAuditLog.STRIPE_CARD_CHANGED,
event_time=event_time)
customer, created = Customer.objects.update_or_create(realm=realm, defaults={
'stripe_customer_id': stripe_customer.id})
Customer.objects.create(realm=realm, stripe_customer_id=stripe_customer.id)
user.is_billing_admin = True
user.save(update_fields=["is_billing_admin"])
return customer
return stripe_customer
@catch_stripe_errors
def do_replace_payment_source(user: UserProfile, stripe_token: str,
pay_invoices: bool=False) -> stripe.Customer:
customer = get_customer_by_realm(user.realm)
assert(customer is not None) # for mypy
stripe_customer = stripe_get_customer(customer.stripe_customer_id)
def do_replace_payment_source(user: UserProfile, stripe_token: str) -> stripe.Customer:
stripe_customer = stripe_get_customer(Customer.objects.get(realm=user.realm).stripe_customer_id)
stripe_customer.source = stripe_token
# Deletes existing card: https://stripe.com/docs/api#update_customer-source
# This can also have other side effects, e.g. it will try to pay certain past-due
# invoices: https://stripe.com/docs/api#update_customer
updated_stripe_customer = stripe.Customer.save(stripe_customer)
RealmAuditLog.objects.create(
realm=user.realm, acting_user=user, event_type=RealmAuditLog.STRIPE_CARD_CHANGED,
event_time=timezone_now())
if pay_invoices:
for stripe_invoice in stripe.Invoice.list(
billing='charge_automatically', customer=stripe_customer.id, status='open'):
# The user will get either a receipt or a "failed payment" email, but the in-app
# messaging could be clearer here (e.g. it could explicitly tell the user that there
# were payment(s) and that they succeeded or failed).
# Worth fixing if we notice that a lot of cards end up failing at this step.
stripe.Invoice.pay(stripe_invoice)
return updated_stripe_customer
# event_time should roughly be timezone_now(). Not designed to handle
# event_times in the past or future
@transaction.atomic
def make_end_of_cycle_updates_if_needed(plan: CustomerPlan,
event_time: datetime) -> Tuple[Optional[CustomerPlan], Optional[LicenseLedger]]:
last_ledger_entry = LicenseLedger.objects.filter(plan=plan).order_by('-id').first()
last_renewal = LicenseLedger.objects.filter(plan=plan, is_renewal=True) \
.order_by('-id').first().event_time
next_billing_cycle = start_of_next_billing_cycle(plan, last_renewal)
if next_billing_cycle <= event_time:
if plan.status == CustomerPlan.ACTIVE:
return None, LicenseLedger.objects.create(
plan=plan, is_renewal=True, event_time=next_billing_cycle,
licenses=last_ledger_entry.licenses_at_next_renewal,
licenses_at_next_renewal=last_ledger_entry.licenses_at_next_renewal)
if plan.status == CustomerPlan.FREE_TRIAL:
plan.invoiced_through = last_ledger_entry
assert(plan.next_invoice_date is not None)
plan.billing_cycle_anchor = plan.next_invoice_date.replace(microsecond=0)
plan.status = CustomerPlan.ACTIVE
plan.save(update_fields=["invoiced_through", "billing_cycle_anchor", "status"])
return None, LicenseLedger.objects.create(
plan=plan, is_renewal=True, event_time=next_billing_cycle,
licenses=last_ledger_entry.licenses_at_next_renewal,
licenses_at_next_renewal=last_ledger_entry.licenses_at_next_renewal)
if plan.status == CustomerPlan.SWITCH_TO_ANNUAL_AT_END_OF_CYCLE:
if plan.fixed_price is not None: # nocoverage
raise NotImplementedError("Can't switch fixed priced monthly plan to annual.")
plan.status = CustomerPlan.ENDED
plan.save(update_fields=["status"])
discount = plan.customer.default_discount or plan.discount
_, _, _, price_per_license = compute_plan_parameters(
automanage_licenses=plan.automanage_licenses, billing_schedule=CustomerPlan.ANNUAL,
discount=plan.discount
)
new_plan = CustomerPlan.objects.create(
customer=plan.customer, billing_schedule=CustomerPlan.ANNUAL, automanage_licenses=plan.automanage_licenses,
charge_automatically=plan.charge_automatically, price_per_license=price_per_license,
discount=discount, billing_cycle_anchor=next_billing_cycle,
tier=plan.tier, status=CustomerPlan.ACTIVE, next_invoice_date=next_billing_cycle,
invoiced_through=None, invoicing_status=CustomerPlan.INITIAL_INVOICE_TO_BE_SENT,
)
new_plan_ledger_entry = LicenseLedger.objects.create(
plan=new_plan, is_renewal=True, event_time=next_billing_cycle,
licenses=last_ledger_entry.licenses_at_next_renewal,
licenses_at_next_renewal=last_ledger_entry.licenses_at_next_renewal
)
RealmAuditLog.objects.create(
realm=new_plan.customer.realm, event_time=event_time,
event_type=RealmAuditLog.CUSTOMER_SWITCHED_FROM_MONTHLY_TO_ANNUAL_PLAN,
extra_data=ujson.dumps({
"monthly_plan_id": plan.id,
"annual_plan_id": new_plan.id,
})
)
return new_plan, new_plan_ledger_entry
if plan.status == CustomerPlan.DOWNGRADE_AT_END_OF_CYCLE:
process_downgrade(plan)
return None, None
return None, last_ledger_entry
# Returns Customer instead of stripe_customer so that we don't make a Stripe
# API call if there's nothing to update
def update_or_create_stripe_customer(user: UserProfile, stripe_token: Optional[str]=None) -> Customer:
realm = user.realm
customer = get_customer_by_realm(realm)
if customer is None or customer.stripe_customer_id is None:
return do_create_stripe_customer(user, stripe_token=stripe_token)
if stripe_token is not None:
do_replace_payment_source(user, stripe_token)
return customer
def compute_plan_parameters(
automanage_licenses: bool, billing_schedule: int,
discount: Optional[Decimal],
free_trial: bool=False) -> Tuple[datetime, datetime, datetime, int]:
# Everything in Stripe is stored as timestamps with 1 second resolution,
# so standardize on 1 second resolution.
# TODO talk about leapseconds?
billing_cycle_anchor = timezone_now().replace(microsecond=0)
if billing_schedule == CustomerPlan.ANNUAL:
# TODO use variables to account for Zulip Plus
price_per_license = 8000
period_end = add_months(billing_cycle_anchor, 12)
elif billing_schedule == CustomerPlan.MONTHLY:
price_per_license = 800
period_end = add_months(billing_cycle_anchor, 1)
else:
raise AssertionError(f'Unknown billing_schedule: {billing_schedule}')
if discount is not None:
# There are no fractional cents in Stripe, so round down to nearest integer.
price_per_license = int(float(price_per_license * (1 - discount / 100)) + .00001)
next_invoice_date = period_end
if automanage_licenses:
next_invoice_date = add_months(billing_cycle_anchor, 1)
if free_trial:
period_end = billing_cycle_anchor + timedelta(days=settings.FREE_TRIAL_DAYS)
next_invoice_date = period_end
return billing_cycle_anchor, next_invoice_date, period_end, price_per_license
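A quick worked example of the discount rounding above, with illustrative numbers: on the annual schedule the base price is 8000 cents per license, and a 25% default discount rounds down to whole cents as follows.
from decimal import Decimal
int(float(8000 * (1 - Decimal(25) / 100)) + .00001)  # -> 6000 cents per license, i.e. $60/user/year
The monthly schedule applies the same formula starting from 800 cents.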
# Only used for cloud signups
@catch_stripe_errors
def process_initial_upgrade(user: UserProfile, licenses: int, automanage_licenses: bool,
billing_schedule: int, stripe_token: Optional[str]) -> None:
realm = user.realm
customer = update_or_create_stripe_customer(user, stripe_token=stripe_token)
charge_automatically = stripe_token is not None
free_trial = settings.FREE_TRIAL_DAYS not in (None, 0)
def do_replace_coupon(user: UserProfile, coupon: Coupon) -> stripe.Customer:
stripe_customer = stripe_get_customer(Customer.objects.get(realm=user.realm).stripe_customer_id)
stripe_customer.coupon = coupon.stripe_coupon_id
return stripe.Customer.save(stripe_customer)
if get_current_plan_by_customer(customer) is not None:
# Unlikely race condition from two people upgrading (clicking "Make payment")
# at exactly the same time. Doesn't fully resolve the race condition, but having
# a check here reduces the likelihood.
billing_logger.warning(
"Customer %s trying to upgrade, but has an active subscription", customer,
)
@catch_stripe_errors
def do_subscribe_customer_to_plan(user: UserProfile, stripe_customer: stripe.Customer, stripe_plan_id: str,
seat_count: int, tax_percent: float) -> None:
if extract_current_subscription(stripe_customer) is not None:
# Most likely due to two people in the org going to the billing page,
# and then both upgrading their plan. We don't send clients
# real-time event updates for the billing pages, so this is more
# likely than it would be in other parts of the app.
billing_logger.error("Stripe customer %s trying to subscribe to %s, "
"but has an active subscription" % (stripe_customer.id, stripe_plan_id))
raise BillingError('subscribing with existing subscription', BillingError.TRY_RELOADING)
customer = Customer.objects.get(stripe_customer_id=stripe_customer.id)
# Note that there is a race condition here, where if two users upgrade at exactly the
# same time, they will have two subscriptions, and get charged twice. We could try to
# reduce the chance of it with a well-designed idempotency_key, but it's not easy since
# we also need to be careful not to block the customer from retrying if their
# subscription attempt fails (e.g. due to insufficient funds).
billing_cycle_anchor, next_invoice_date, period_end, price_per_license = compute_plan_parameters(
automanage_licenses, billing_schedule, customer.default_discount, free_trial)
# The main design constraint in this function is that if you upgrade with a credit card, and the
# charge fails, everything should be rolled back as if nothing had happened. This is because we
# expect frequent card failures on initial signup.
# Hence, if we're going to charge a card, do it at the beginning, even if we later may have to
# adjust the number of licenses.
if charge_automatically:
if not free_trial:
stripe_charge = stripe.Charge.create(
amount=price_per_license * licenses,
currency='usd',
customer=customer.stripe_customer_id,
description=f"Upgrade to Zulip Standard, ${price_per_license/100} x {licenses}",
receipt_email=user.delivery_email,
statement_descriptor='Zulip Standard')
# Not setting a period start and end, but maybe we should? Unclear what will make things
# most similar to the renewal case from an accounting perspective.
assert isinstance(stripe_charge.source, stripe.Card)
description = f"Payment (Card ending in {stripe_charge.source.last4})"
stripe.InvoiceItem.create(
amount=price_per_license * licenses * -1,
currency='usd',
customer=customer.stripe_customer_id,
description=description,
discountable=False)
# TODO: The correctness of this relies on user creation, deactivation, etc being
# in a transaction.atomic() with the relevant RealmAuditLog entries
# Success here implies the stripe_customer was charged: https://stripe.com/docs/billing/lifecycle#active
# Otherwise we should expect it to throw a stripe.error.
stripe_subscription = stripe.Subscription.create(
customer=stripe_customer.id,
billing='charge_automatically',
items=[{
'plan': stripe_plan_id,
'quantity': seat_count,
}],
prorate=True,
tax_percent=tax_percent)
if PRINT_STRIPE_FIXTURE_DATA:
print(''.join(['"create_subscription": ', str(stripe_subscription), ','])) # nocoverage
with transaction.atomic():
# billed_licenses can greater than licenses if users are added between the start of
# this function (process_initial_upgrade) and now
billed_licenses = max(get_latest_seat_count(realm), licenses)
plan_params = {
'automanage_licenses': automanage_licenses,
'charge_automatically': charge_automatically,
'price_per_license': price_per_license,
'discount': customer.default_discount,
'billing_cycle_anchor': billing_cycle_anchor,
'billing_schedule': billing_schedule,
'tier': CustomerPlan.STANDARD}
if free_trial:
plan_params['status'] = CustomerPlan.FREE_TRIAL
plan = CustomerPlan.objects.create(
customer=customer,
next_invoice_date=next_invoice_date,
**plan_params)
ledger_entry = LicenseLedger.objects.create(
plan=plan,
is_renewal=True,
event_time=billing_cycle_anchor,
licenses=billed_licenses,
licenses_at_next_renewal=billed_licenses)
plan.invoiced_through = ledger_entry
plan.save(update_fields=['invoiced_through'])
customer.has_billing_relationship = True
customer.save(update_fields=['has_billing_relationship'])
customer.realm.has_seat_based_plan = True
customer.realm.save(update_fields=['has_seat_based_plan'])
RealmAuditLog.objects.create(
realm=realm, acting_user=user, event_time=billing_cycle_anchor,
event_type=RealmAuditLog.CUSTOMER_PLAN_CREATED,
extra_data=ujson.dumps(plan_params))
realm=customer.realm,
acting_user=user,
event_type=RealmAuditLog.STRIPE_PLAN_CHANGED,
event_time=timestamp_to_datetime(stripe_subscription.created),
extra_data=ujson.dumps({'plan': stripe_plan_id, 'quantity': seat_count}))
if not free_trial:
stripe.InvoiceItem.create(
currency='usd',
customer=customer.stripe_customer_id,
description='Zulip Standard',
discountable=False,
period = {'start': datetime_to_timestamp(billing_cycle_anchor),
'end': datetime_to_timestamp(period_end)},
quantity=billed_licenses,
unit_amount=price_per_license)
current_seat_count = get_seat_count(customer.realm)
if seat_count != current_seat_count:
RealmAuditLog.objects.create(
realm=customer.realm,
event_type=RealmAuditLog.STRIPE_PLAN_QUANTITY_RESET,
event_time=timestamp_to_datetime(stripe_subscription.created),
requires_billing_update=True,
extra_data=ujson.dumps({'quantity': current_seat_count}))
if charge_automatically:
billing_method = 'charge_automatically'
days_until_due = None
else:
billing_method = 'send_invoice'
days_until_due = DEFAULT_INVOICE_DAYS_UNTIL_DUE
stripe_invoice = stripe.Invoice.create(
auto_advance=True,
billing=billing_method,
customer=customer.stripe_customer_id,
days_until_due=days_until_due,
statement_descriptor='Zulip Standard')
stripe.Invoice.finalize_invoice(stripe_invoice)
from zerver.lib.actions import do_change_plan_type
do_change_plan_type(realm, Realm.STANDARD)
def update_license_ledger_for_automanaged_plan(realm: Realm, plan: CustomerPlan,
event_time: datetime) -> None:
new_plan, last_ledger_entry = make_end_of_cycle_updates_if_needed(plan, event_time)
if last_ledger_entry is None:
return
if new_plan is not None:
plan = new_plan
licenses_at_next_renewal = get_latest_seat_count(realm)
licenses = max(licenses_at_next_renewal, last_ledger_entry.licenses)
LicenseLedger.objects.create(
plan=plan, event_time=event_time, licenses=licenses,
licenses_at_next_renewal=licenses_at_next_renewal)
def update_license_ledger_if_needed(realm: Realm, event_time: datetime) -> None:
plan = get_current_plan_by_realm(realm)
if plan is None:
return
if not plan.automanage_licenses:
return
update_license_ledger_for_automanaged_plan(realm, plan, event_time)
def invoice_plan(plan: CustomerPlan, event_time: datetime) -> None:
if plan.invoicing_status == CustomerPlan.STARTED:
raise NotImplementedError('Plan with invoicing_status==STARTED needs manual resolution.')
make_end_of_cycle_updates_if_needed(plan, event_time)
if plan.invoicing_status == CustomerPlan.INITIAL_INVOICE_TO_BE_SENT:
invoiced_through_id = -1
licenses_base = None
def process_initial_upgrade(user: UserProfile, plan: Plan, seat_count: int, stripe_token: str) -> None:
customer = Customer.objects.filter(realm=user.realm).first()
if customer is None:
stripe_customer = do_create_customer(user, stripe_token=stripe_token)
else:
assert(plan.invoiced_through is not None)
licenses_base = plan.invoiced_through.licenses
invoiced_through_id = plan.invoiced_through.id
stripe_customer = do_replace_payment_source(user, stripe_token)
do_subscribe_customer_to_plan(
user=user,
stripe_customer=stripe_customer,
stripe_plan_id=plan.stripe_plan_id,
seat_count=seat_count,
# TODO: billing address details are passed to us in the request;
# use that to calculate taxes.
tax_percent=0)
do_change_plan_type(user, Realm.STANDARD)
invoice_item_created = False
for ledger_entry in LicenseLedger.objects.filter(plan=plan, id__gt=invoiced_through_id,
event_time__lte=event_time).order_by('id'):
price_args: Dict[str, int] = {}
if ledger_entry.is_renewal:
if plan.fixed_price is not None:
price_args = {'amount': plan.fixed_price}
def attach_discount_to_realm(user: UserProfile, percent_off: int) -> None:
coupon = Coupon.objects.get(percent_off=percent_off)
customer = Customer.objects.filter(realm=user.realm).first()
if customer is None:
do_create_customer(user, coupon=coupon)
else:
do_replace_coupon(user, coupon)
@catch_stripe_errors
def process_downgrade(user: UserProfile) -> None:
stripe_customer = stripe_get_customer(
Customer.objects.filter(realm=user.realm).first().stripe_customer_id)
subscription_balance = preview_invoice_total_for_downgrade(stripe_customer)
# If subscription_balance > 0, they owe us money. This is likely due to
# people they added in the last day, so we can just forgive it.
# Stripe automatically forgives it when we delete the subscription, so nothing we need to do there.
if subscription_balance < 0:
stripe_customer.account_balance = stripe_customer.account_balance + subscription_balance
stripe_subscription = extract_current_subscription(stripe_customer)
# Wish these two could be transaction.atomic
stripe_subscription = stripe_subscription.delete()
stripe.Customer.save(stripe_customer)
with transaction.atomic():
user.realm.has_seat_based_plan = False
user.realm.save(update_fields=['has_seat_based_plan'])
RealmAuditLog.objects.create(
realm=user.realm,
acting_user=user,
event_type=RealmAuditLog.STRIPE_PLAN_CHANGED,
event_time=timestamp_to_datetime(stripe_subscription.canceled_at),
extra_data=ujson.dumps({'plan': None, 'quantity': stripe_subscription.quantity}))
# Doing this last, since it results in user-visible confirmation (via
# product changes) that the downgrade succeeded.
# Keeping it out of the transaction.atomic block because it will
# eventually have a lot of stuff going on.
do_change_plan_type(user, Realm.LIMITED)
## Process RealmAuditLog
def do_set_subscription_quantity(
customer: Customer, timestamp: int, idempotency_key: str, quantity: int) -> None:
stripe_customer = stripe_get_customer(customer.stripe_customer_id)
stripe_subscription = extract_current_subscription(stripe_customer)
stripe_subscription.quantity = quantity
stripe_subscription.proration_date = timestamp
stripe_subscription.save(idempotency_key=idempotency_key)
def do_adjust_subscription_quantity(
customer: Customer, timestamp: int, idempotency_key: str, delta: int) -> None:
stripe_customer = stripe_get_customer(customer.stripe_customer_id)
stripe_subscription = extract_current_subscription(stripe_customer)
stripe_subscription.quantity = stripe_subscription.quantity + delta
stripe_subscription.proration_date = timestamp
stripe_subscription.save(idempotency_key=idempotency_key)
def increment_subscription_quantity(
customer: Customer, timestamp: int, idempotency_key: str) -> None:
return do_adjust_subscription_quantity(customer, timestamp, idempotency_key, 1)
def decrement_subscription_quantity(
customer: Customer, timestamp: int, idempotency_key: str) -> None:
return do_adjust_subscription_quantity(customer, timestamp, idempotency_key, -1)
@catch_stripe_errors
def process_billing_log_entry(processor: BillingProcessor, log_row: RealmAuditLog) -> None:
processor.state = BillingProcessor.STARTED
processor.log_row = log_row
processor.save()
customer = Customer.objects.get(realm=log_row.realm)
timestamp = datetime_to_timestamp(log_row.event_time)
idempotency_key = 'process_billing_log_entry:%s' % (log_row.id,)
extra_args = {} # type: Dict[str, Any]
if log_row.extra_data is not None:
extra_args = ujson.loads(log_row.extra_data)
processing_functions = {
RealmAuditLog.STRIPE_PLAN_QUANTITY_RESET: do_set_subscription_quantity,
RealmAuditLog.USER_CREATED: increment_subscription_quantity,
RealmAuditLog.USER_ACTIVATED: increment_subscription_quantity,
RealmAuditLog.USER_DEACTIVATED: decrement_subscription_quantity,
RealmAuditLog.USER_REACTIVATED: increment_subscription_quantity,
} # type: Dict[str, Callable[..., None]]
processing_functions[log_row.event_type](customer, timestamp, idempotency_key, **extra_args)
processor.state = BillingProcessor.DONE
processor.save()
def get_next_billing_log_entry(processor: BillingProcessor) -> Optional[RealmAuditLog]:
if processor.state == BillingProcessor.STARTED:
return processor.log_row
assert processor.state != BillingProcessor.STALLED
if processor.state not in [BillingProcessor.DONE, BillingProcessor.SKIPPED]:
raise BillingError(
'unknown processor state',
"Check for typos, since this value is sometimes set by hand: %s" % (processor.state,))
if processor.realm is None:
realms_with_processors = BillingProcessor.objects.exclude(
realm=None).values_list('realm', flat=True)
query = RealmAuditLog.objects.exclude(realm__in=realms_with_processors)
else:
global_processor = BillingProcessor.objects.get(realm=None)
query = RealmAuditLog.objects.filter(
realm=processor.realm, id__lt=global_processor.log_row.id)
return query.filter(id__gt=processor.log_row.id,
requires_billing_update=True).order_by('id').first()
def run_billing_processor_one_step(processor: BillingProcessor) -> bool:
# Returns True if a row was processed, or if processing was attempted
log_row = get_next_billing_log_entry(processor)
if log_row is None:
if processor.realm is not None:
processor.delete()
return False
try:
process_billing_log_entry(processor, log_row)
return True
except Exception as e:
# Possible errors include processing subscription quantity entries
# after downgrade, since the downgrade code doesn't check that
# billing processor is up to date
billing_logger.error("Error on log_row.realm=%s, event_type=%s, log_row.id=%s, "
"processor.id=%s, processor.realm=%s" % (
processor.log_row.realm.string_id, processor.log_row.event_type,
processor.log_row.id, processor.id, processor.realm))
if isinstance(e, StripeCardError):
if processor.realm is None:
BillingProcessor.objects.create(log_row=processor.log_row,
realm=processor.log_row.realm,
state=BillingProcessor.STALLED)
processor.state = BillingProcessor.SKIPPED
else:
assert(plan.price_per_license is not None) # needed for mypy
price_args = {'unit_amount': plan.price_per_license,
'quantity': ledger_entry.licenses}
description = "Zulip Standard - renewal"
elif licenses_base is not None and ledger_entry.licenses != licenses_base:
assert(plan.price_per_license)
last_renewal = LicenseLedger.objects.filter(
plan=plan, is_renewal=True, event_time__lte=ledger_entry.event_time) \
.order_by('-id').first().event_time
period_end = start_of_next_billing_cycle(plan, ledger_entry.event_time)
proration_fraction = (period_end - ledger_entry.event_time) / (period_end - last_renewal)
price_args = {'unit_amount': int(plan.price_per_license * proration_fraction + .5),
'quantity': ledger_entry.licenses - licenses_base}
description = "Additional license ({} - {})".format(
ledger_entry.event_time.strftime('%b %-d, %Y'), period_end.strftime('%b %-d, %Y'))
if price_args:
plan.invoiced_through = ledger_entry
plan.invoicing_status = CustomerPlan.STARTED
plan.save(update_fields=['invoicing_status', 'invoiced_through'])
stripe.InvoiceItem.create(
currency='usd',
customer=plan.customer.stripe_customer_id,
description=description,
discountable=False,
period = {'start': datetime_to_timestamp(ledger_entry.event_time),
'end': datetime_to_timestamp(
start_of_next_billing_cycle(plan, ledger_entry.event_time))},
idempotency_key=get_idempotency_key(ledger_entry),
**price_args)
invoice_item_created = True
plan.invoiced_through = ledger_entry
plan.invoicing_status = CustomerPlan.DONE
plan.save(update_fields=['invoicing_status', 'invoiced_through'])
licenses_base = ledger_entry.licenses
if invoice_item_created:
if plan.charge_automatically:
billing_method = 'charge_automatically'
days_until_due = None
else:
billing_method = 'send_invoice'
days_until_due = DEFAULT_INVOICE_DAYS_UNTIL_DUE
stripe_invoice = stripe.Invoice.create(
auto_advance=True,
billing=billing_method,
customer=plan.customer.stripe_customer_id,
days_until_due=days_until_due,
statement_descriptor='Zulip Standard')
stripe.Invoice.finalize_invoice(stripe_invoice)
plan.next_invoice_date = next_invoice_date(plan)
plan.save(update_fields=['next_invoice_date'])
def invoice_plans_as_needed(event_time: datetime=timezone_now()) -> None:
for plan in CustomerPlan.objects.filter(next_invoice_date__lte=event_time):
invoice_plan(plan, event_time)
def attach_discount_to_realm(realm: Realm, discount: Decimal) -> None:
Customer.objects.update_or_create(realm=realm, defaults={'default_discount': discount})
def update_sponsorship_status(realm: Realm, sponsorship_pending: bool) -> None:
customer, _ = Customer.objects.get_or_create(realm=realm)
customer.sponsorship_pending = sponsorship_pending
customer.save(update_fields=["sponsorship_pending"])
def get_discount_for_realm(realm: Realm) -> Optional[Decimal]:
customer = get_customer_by_realm(realm)
if customer is not None:
return customer.default_discount
return None
def do_change_plan_status(plan: CustomerPlan, status: int) -> None:
plan.status = status
plan.save(update_fields=['status'])
billing_logger.info(
'Change plan status: Customer.id: %s, CustomerPlan.id: %s, status: %s',
plan.customer.id, plan.id, status,
)
def process_downgrade(plan: CustomerPlan) -> None:
from zerver.lib.actions import do_change_plan_type
do_change_plan_type(plan.customer.realm, Realm.LIMITED)
plan.status = CustomerPlan.ENDED
plan.save(update_fields=['status'])
def estimate_annual_recurring_revenue_by_realm() -> Dict[str, int]: # nocoverage
annual_revenue = {}
for plan in CustomerPlan.objects.filter(
status=CustomerPlan.ACTIVE).select_related('customer__realm'):
# TODO: figure out what to do for plans that don't automatically
# renew, but which probably will renew
renewal_cents = renewal_amount(plan, timezone_now())
if plan.billing_schedule == CustomerPlan.MONTHLY:
renewal_cents *= 12
# TODO: Decimal stuff
annual_revenue[plan.customer.realm.string_id] = int(renewal_cents / 100)
return annual_revenue
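To make the unit handling above concrete (illustrative numbers): a monthly plan with 10 licenses at 800 cents per license renews for 8,000 cents, which this function annualizes and converts to whole dollars.
renewal_cents = 800 * 10     # 10 licenses at 800 cents/month (illustrative)
renewal_cents *= 12          # monthly plans are annualized
int(renewal_cents / 100)     # -> 960, stored as dollars for this realm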
# During realm deactivation we instantly downgrade the plan to Limited.
# Extra users added in the final month are not charged. Also used
# for the cancelation of Free Trial.
def downgrade_now(realm: Realm) -> None:
plan = get_current_plan_by_realm(realm)
if plan is None:
return
process_downgrade(plan)
plan.invoiced_through = LicenseLedger.objects.filter(plan=plan).order_by('id').last()
plan.next_invoice_date = next_invoice_date(plan)
plan.save(update_fields=["invoiced_through", "next_invoice_date"])
processor.state = BillingProcessor.STALLED
processor.save()
return True
raise


@@ -0,0 +1,47 @@
"""\
Run BillingProcessors.
This management command is run via supervisor. Do not run on multiple
machines, as the code has not been made robust to race conditions from doing
so. (Alternatively, you can set `BILLING_PROCESSOR_ENABLED=False` on all but
one machine to make the command have no effect.)
"""
import time
from typing import Any
from django.conf import settings
from django.core.management.base import BaseCommand
from zerver.lib.context_managers import lockfile
from zerver.lib.management import sleep_forever
from corporate.lib.stripe import StripeConnectionError, \
run_billing_processor_one_step
from corporate.models import BillingProcessor
class Command(BaseCommand):
help = """Run BillingProcessors, to sync billing-relevant updates into Stripe.
Run this command under supervisor.
Usage: ./manage.py process_billing_updates
"""
def handle(self, *args: Any, **options: Any) -> None:
if not settings.BILLING_PROCESSOR_ENABLED:
sleep_forever()
with lockfile("/tmp/zulip_billing_processor.lockfile"):
while True:
for processor in BillingProcessor.objects.exclude(
state=BillingProcessor.STALLED):
try:
entry_processed = run_billing_processor_one_step(processor)
except StripeConnectionError:
time.sleep(5*60)
# Less load on the db during times of activity
# and more responsiveness when the load is low
if entry_processed:
time.sleep(10)
else:
time.sleep(2)


@@ -0,0 +1,55 @@
from zerver.lib.management import ZulipBaseCommand
from corporate.models import Plan, Coupon, Customer
from zproject.settings import get_secret
from typing import Any
import stripe
stripe.api_key = get_secret('stripe_secret_key')
class Command(ZulipBaseCommand):
help = """Script to add the appropriate products and plans to Stripe."""
def handle(self, *args: Any, **options: Any) -> None:
Customer.objects.all().delete()
Plan.objects.all().delete()
Coupon.objects.all().delete()
# Zulip Cloud offerings
product = stripe.Product.create(
name="Zulip Cloud Standard",
type='service',
statement_descriptor="Zulip Cloud Standard",
unit_label="user")
plan = stripe.Plan.create(
currency='usd',
interval='month',
product=product.id,
amount=800,
billing_scheme='per_unit',
nickname=Plan.CLOUD_MONTHLY,
usage_type='licensed')
Plan.objects.create(nickname=Plan.CLOUD_MONTHLY, stripe_plan_id=plan.id)
plan = stripe.Plan.create(
currency='usd',
interval='year',
product=product.id,
amount=8000,
billing_scheme='per_unit',
nickname=Plan.CLOUD_ANNUAL,
usage_type='licensed')
Plan.objects.create(nickname=Plan.CLOUD_ANNUAL, stripe_plan_id=plan.id)
coupon = stripe.Coupon.create(
duration='forever',
name='25% discount',
percent_off=25)
Coupon.objects.create(percent_off=25, stripe_coupon_id=coupon.id)
coupon = stripe.Coupon.create(
duration='forever',
name='85% discount',
percent_off=85)
Coupon.objects.create(percent_off=85, stripe_coupon_id=coupon.id)


@@ -1,7 +1,9 @@
# -*- coding: utf-8 -*-
# Generated by Django 1.11.14 on 2018-09-25 12:02
from __future__ import unicode_literals
import django.db.models.deletion
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):


@@ -1,18 +0,0 @@
# Generated by Django 1.11.16 on 2018-12-12 20:19
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0001_initial'),
]
operations = [
migrations.AddField(
model_name='customer',
name='default_discount',
field=models.DecimalField(decimal_places=4, max_digits=7, null=True),
),
]


@@ -1,33 +0,0 @@
# Generated by Django 1.11.16 on 2018-12-22 21:05
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0002_customer_default_discount'),
]
operations = [
migrations.CreateModel(
name='CustomerPlan',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('licenses', models.IntegerField()),
('automanage_licenses', models.BooleanField(default=False)),
('charge_automatically', models.BooleanField(default=False)),
('price_per_license', models.IntegerField(null=True)),
('fixed_price', models.IntegerField(null=True)),
('discount', models.DecimalField(decimal_places=4, max_digits=6, null=True)),
('billing_cycle_anchor', models.DateTimeField()),
('billing_schedule', models.SmallIntegerField()),
('billed_through', models.DateTimeField()),
('next_billing_date', models.DateTimeField(db_index=True)),
('tier', models.SmallIntegerField()),
('status', models.SmallIntegerField(default=1)),
('customer', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='corporate.Customer')),
],
),
]


@@ -1,25 +0,0 @@
# Generated by Django 1.11.18 on 2019-01-19 05:01
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0003_customerplan'),
]
operations = [
migrations.CreateModel(
name='LicenseLedger',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('is_renewal', models.BooleanField(default=False)),
('event_time', models.DateTimeField()),
('licenses', models.IntegerField()),
('licenses_at_next_renewal', models.IntegerField(null=True)),
('plan', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='corporate.CustomerPlan')),
],
),
]


@@ -1,33 +0,0 @@
# Generated by Django 1.11.18 on 2019-01-28 13:04
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0004_licenseledger'),
]
operations = [
migrations.RenameField(
model_name='customerplan',
old_name='next_billing_date',
new_name='next_invoice_date',
),
migrations.RemoveField(
model_name='customerplan',
name='billed_through',
),
migrations.AddField(
model_name='customerplan',
name='invoiced_through',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='+', to='corporate.LicenseLedger'),
),
migrations.AddField(
model_name='customerplan',
name='invoicing_status',
field=models.SmallIntegerField(default=1),
),
]


@@ -1,18 +0,0 @@
# Generated by Django 1.11.18 on 2019-01-29 01:46
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0005_customerplan_invoicing'),
]
operations = [
migrations.AlterField(
model_name='customer',
name='stripe_customer_id',
field=models.CharField(max_length=255, null=True, unique=True),
),
]


@@ -1,38 +0,0 @@
# Generated by Django 1.11.18 on 2019-01-31 22:16
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('corporate', '0006_nullable_stripe_customer_id'),
]
operations = [
migrations.RemoveField(
model_name='billingprocessor',
name='log_row',
),
migrations.RemoveField(
model_name='billingprocessor',
name='realm',
),
migrations.DeleteModel(
name='Coupon',
),
migrations.DeleteModel(
name='Plan',
),
migrations.RemoveField(
model_name='customer',
name='has_billing_relationship',
),
migrations.RemoveField(
model_name='customerplan',
name='licenses',
),
migrations.DeleteModel(
name='BillingProcessor',
),
]


@@ -1,18 +0,0 @@
# Generated by Django 1.11.20 on 2019-04-11 00:45
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0007_remove_deprecated_fields'),
]
operations = [
migrations.AlterField(
model_name='customerplan',
name='next_invoice_date',
field=models.DateTimeField(db_index=True, null=True),
),
]


@@ -1,18 +0,0 @@
# Generated by Django 2.2.13 on 2020-06-09 12:09
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('corporate', '0008_nullable_next_invoice_date'),
]
operations = [
migrations.AddField(
model_name='customer',
name='sponsorship_pending',
field=models.BooleanField(default=False),
),
]


@@ -1,87 +1,46 @@
 import datetime
 from decimal import Decimal
 from typing import Optional
 from django.db import models
-from django.db.models import CASCADE
-from zerver.models import Realm
+from zerver.models import Realm, RealmAuditLog
 class Customer(models.Model):
-    realm: Realm = models.OneToOneField(Realm, on_delete=CASCADE)
-    stripe_customer_id: str = models.CharField(max_length=255, null=True, unique=True)
-    sponsorship_pending: bool = models.BooleanField(default=False)
-    # A percentage, like 85.
-    default_discount: Optional[Decimal] = models.DecimalField(decimal_places=4, max_digits=7, null=True)
+    realm = models.OneToOneField(Realm, on_delete=models.CASCADE)  # type: Realm
+    stripe_customer_id = models.CharField(max_length=255, unique=True)  # type: str
+    # Becomes True the first time a payment successfully goes through, and never
+    # goes back to being False
+    has_billing_relationship = models.BooleanField(default=False)  # type: bool
     def __str__(self) -> str:
-        return f"<Customer {self.realm} {self.stripe_customer_id}>"
+        return "<Customer %s %s>" % (self.realm, self.stripe_customer_id)
-def get_customer_by_realm(realm: Realm) -> Optional[Customer]:
-    return Customer.objects.filter(realm=realm).first()
+class Plan(models.Model):
+    # The two possible values for nickname
+    CLOUD_MONTHLY = 'monthly'
+    CLOUD_ANNUAL = 'annual'
+    nickname = models.CharField(max_length=40, unique=True)  # type: str
-class CustomerPlan(models.Model):
-    customer: Customer = models.ForeignKey(Customer, on_delete=CASCADE)
-    automanage_licenses: bool = models.BooleanField(default=False)
-    charge_automatically: bool = models.BooleanField(default=False)
+    stripe_plan_id = models.CharField(max_length=255, unique=True)  # type: str
-    # Both of these are in cents. Exactly one of price_per_license or
-    # fixed_price should be set. fixed_price is only for manual deals, and
-    # can't be set via the self-serve billing system.
-    price_per_license: Optional[int] = models.IntegerField(null=True)
-    fixed_price: Optional[int] = models.IntegerField(null=True)
+class Coupon(models.Model):
+    percent_off = models.SmallIntegerField(unique=True)  # type: int
+    stripe_coupon_id = models.CharField(max_length=255, unique=True)  # type: str
-    # Discount that was applied. For display purposes only.
-    discount: Optional[Decimal] = models.DecimalField(decimal_places=4, max_digits=6, null=True)
+    def __str__(self) -> str:
+        return '<Coupon: %s %s %s>' % (self.percent_off, self.stripe_coupon_id, self.id)
-    billing_cycle_anchor: datetime.datetime = models.DateTimeField()
-    ANNUAL = 1
-    MONTHLY = 2
-    billing_schedule: int = models.SmallIntegerField()
+class BillingProcessor(models.Model):
+    log_row = models.ForeignKey(RealmAuditLog, on_delete=models.CASCADE)  # RealmAuditLog
+    # Exactly one processor, the global processor, has realm=None.
+    realm = models.OneToOneField(Realm, null=True, on_delete=models.CASCADE)  # type: Realm
-    next_invoice_date: Optional[datetime.datetime] = models.DateTimeField(db_index=True, null=True)
-    invoiced_through: Optional["LicenseLedger"] = models.ForeignKey(
-        'LicenseLedger', null=True, on_delete=CASCADE, related_name='+')
-    DONE = 1
-    STARTED = 2
-    INITIAL_INVOICE_TO_BE_SENT = 3
-    invoicing_status: int = models.SmallIntegerField(default=DONE)
+    DONE = 'done'
+    STARTED = 'started'
+    SKIPPED = 'skipped'  # global processor only
+    STALLED = 'stalled'  # realm processors only
+    state = models.CharField(max_length=20)  # type: str
-    STANDARD = 1
-    PLUS = 2  # not available through self-serve signup
-    ENTERPRISE = 10
-    tier: int = models.SmallIntegerField()
+    last_modified = models.DateTimeField(auto_now=True)  # type: datetime.datetime
-    ACTIVE = 1
-    DOWNGRADE_AT_END_OF_CYCLE = 2
-    FREE_TRIAL = 3
-    SWITCH_TO_ANNUAL_AT_END_OF_CYCLE = 4
-    # "Live" plans should have a value < LIVE_STATUS_THRESHOLD.
-    # There should be at most one live plan per customer.
-    LIVE_STATUS_THRESHOLD = 10
-    ENDED = 11
-    NEVER_STARTED = 12
-    status: int = models.SmallIntegerField(default=ACTIVE)
-    # TODO maybe override setattr to ensure billing_cycle_anchor, etc are immutable
-def get_current_plan_by_customer(customer: Customer) -> Optional[CustomerPlan]:
-    return CustomerPlan.objects.filter(
-        customer=customer, status__lt=CustomerPlan.LIVE_STATUS_THRESHOLD).first()
-def get_current_plan_by_realm(realm: Realm) -> Optional[CustomerPlan]:
-    customer = get_customer_by_realm(realm)
-    if customer is None:
-        return None
-    return get_current_plan_by_customer(customer)
-class LicenseLedger(models.Model):
-    plan: CustomerPlan = models.ForeignKey(CustomerPlan, on_delete=CASCADE)
-    # Also True for the initial upgrade.
-    is_renewal: bool = models.BooleanField(default=False)
-    event_time: datetime.datetime = models.DateTimeField()
-    licenses: int = models.IntegerField()
-    # None means the plan does not automatically renew.
-    # This cannot be None if plan.automanage_licenses.
-    licenses_at_next_renewal: Optional[int] = models.IntegerField(null=True)
+    def __str__(self) -> str:
+        return '<BillingProcessor: %s %s %s>' % (self.realm, self.log_row, self.id)

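For orientation, a minimal usage sketch of the plan-lookup helpers on the newer ("-") side of the diff above; some_realm is a placeholder for a zerver.models.Realm instance, and the constants are the ones defined on CustomerPlan in that version.

# Illustrative sketch only; assumes the newer corporate/models.py shown above.
# `some_realm` is a placeholder Realm instance.
plan = get_current_plan_by_realm(some_realm)
if plan is None:
    print("Realm has no live plan (free plan, or a plan that already ENDED).")
elif plan.status == CustomerPlan.FREE_TRIAL:
    print("Realm is on a free trial.")
else:
    schedule = "annual" if plan.billing_schedule == CustomerPlan.ANNUAL else "monthly"
    print(f"Live plan: tier {plan.tier}, billed {schedule}.")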

@@ -0,0 +1,411 @@
{
"create_customer": {
"account_balance": 0,
"created": 1529990750,
"currency": null,
"default_source": "card_1Ch9gVGh0CmXqmnwv94RombT",
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": {
"coupon": {
"amount_off": null,
"created": 1535002820,
"currency": null,
"duration": "forever",
"duration_in_months": null,
"id": "rncBblSZ",
"livemode": false,
"max_redemptions": null,
"metadata": {},
"name": "85% discount",
"object": "coupon",
"percent_off": 85.0,
"redeem_by": null,
"times_redeemed": 1,
"valid": true
},
"customer": "cus_DT7pd3yW0w8lF1",
"end": null,
"object": "discount",
"start": 1535004909,
"subscription": null
},
"email": "hamlet@zulip.com",
"id": "cus_D7OT2jf5YAtZQL",
"invoice_prefix": "23ABC45",
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"object": "customer",
"shipping": null,
"sources": {
"data": [
{
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea",
"address_line1_check": "pass",
"address_line2": null,
"address_state": "FL",
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_D7OT2jf5YAtZQL",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "6dAXT9VZvwro65EK",
"funding": "credit",
"id": "card_1Ch9gVGh0CmXqmnwv94RombT",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/customers/cus_D7OT2jf5YAtZQL/sources"
},
"subscriptions": {}
},
"create_subscription": {
"application_fee_percent": null,
"billing": "charge_automatically",
"billing_cycle_anchor": 1529990751,
"cancel_at_period_end": false,
"canceled_at": null,
"created": 1529990751,
"current_period_end": 1561526751,
"current_period_start": 1529990751,
"customer": "cus_D7OT2jf5YAtZQL",
"days_until_due": null,
"discount": null,
"ended_at": null,
"id": "sub_D7OTT8FZbOPxah",
"items": {
"data": [
{
"created": 1529990751,
"id": "si_D7OTEItF5ZLN2R",
"metadata": {},
"object": "subscription_item",
"plan": {
"active": true,
"aggregate_usage": null,
"amount": 8000,
"billing_scheme": "per_unit",
"created": 1529987890,
"currency": "usd",
"id": "plan_Do3xCvbzO89OsR",
"interval": "year",
"interval_count": 1,
"livemode": false,
"metadata": {},
"nickname": "annual",
"object": "plan",
"product": "prod_D7NhmicJvX2edE",
"tiers": null,
"tiers_mode": null,
"transform_usage": null,
"trial_period_days": null,
"usage_type": "licensed"
},
"quantity": 8,
"subscription": "sub_D7OTT8FZbOPxah"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/subscription_items?subscription=sub_D7OTT8FZbOPxah"
},
"livemode": false,
"metadata": {},
"object": "subscription",
"plan": {
"active": true,
"aggregate_usage": null,
"amount": 8000,
"billing_scheme": "per_unit",
"created": 1529987890,
"currency": "usd",
"id": "plan_Do3xCvbzO89OsR",
"interval": "year",
"interval_count": 1,
"livemode": false,
"metadata": {},
"nickname": "annual",
"object": "plan",
"product": "prod_D7NhmicJvX2edE",
"tiers": null,
"tiers_mode": null,
"transform_usage": null,
"trial_period_days": null,
"usage_type": "licensed"
},
"quantity": 8,
"start": 1529990751,
"status": "active",
"tax_percent": 0.0,
"trial_end": null,
"trial_start": null
},
"customer_with_subscription": {
"account_balance": 0,
"created": 1529990750,
"currency": "usd",
"default_source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea",
"address_line1_check": "pass",
"address_line2": null,
"address_state": "FL",
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_D7OT2jf5YAtZQL",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "6dAXT9VZvwro65EK",
"funding": "credit",
"id": "card_1Ch9gVGh0CmXqmnwv94RombT",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": null,
"email": "hamlet@zulip.com",
"id": "cus_D7OT2jf5YAtZQL",
"invoice_prefix": "23ABC45",
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"object": "customer",
"shipping": null,
"sources": {
"data": [
{
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea",
"address_line1_check": "pass",
"address_line2": null,
"address_state": "FL",
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_D7OT2jf5YAtZQL",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "6dAXT9VZvwro65EK",
"funding": "credit",
"id": "card_1Ch9gVGh0CmXqmnwv94RombT",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/customers/cus_D7OT2jf5YAtZQL/sources"
},
"subscriptions": {
"data": [
{
"application_fee_percent": null,
"billing": "charge_automatically",
"billing_cycle_anchor": 1529990751,
"cancel_at_period_end": false,
"canceled_at": null,
"created": 1529990751,
"current_period_end": 1561526751,
"current_period_start": 1529990751,
"customer": "cus_D7OT2jf5YAtZQL",
"days_until_due": null,
"discount": null,
"ended_at": null,
"id": "sub_D7OTT8FZbOPxah",
"items": {
"data": [
{
"created": 1529990751,
"id": "si_D7OTEItF5ZLN2R",
"metadata": {},
"object": "subscription_item",
"plan": {
"active": true,
"aggregate_usage": null,
"amount": 8000,
"billing_scheme": "per_unit",
"created": 1529987890,
"currency": "usd",
"id": "plan_Do3xCvbzO89OsR",
"interval": "year",
"interval_count": 1,
"livemode": false,
"metadata": {},
"nickname": "annual",
"object": "plan",
"product": "prod_D7NhmicJvX2edE",
"tiers": null,
"tiers_mode": null,
"transform_usage": null,
"trial_period_days": null,
"usage_type": "licensed"
},
"quantity": 8,
"subscription": "sub_D7OTT8FZbOPxah"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/subscription_items?subscription=sub_D7OTT8FZbOPxah"
},
"livemode": false,
"metadata": {},
"object": "subscription",
"plan": {
"active": true,
"aggregate_usage": null,
"amount": 8000,
"billing_scheme": "per_unit",
"created": 1529987890,
"currency": "usd",
"id": "plan_Do3xCvbzO89OsR",
"interval": "year",
"interval_count": 1,
"livemode": false,
"metadata": {},
"nickname": "annual",
"object": "plan",
"product": "prod_D7NhmicJvX2edE",
"tiers": null,
"tiers_mode": null,
"transform_usage": null,
"trial_period_days": null,
"usage_type": "licensed"
},
"quantity": 8,
"start": 1529990751,
"status": "active",
"tax_percent": 0.0,
"trial_end": null,
"trial_start": null
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/customers/cus_D7OT2jf5YAtZQL/subscriptions"
}
},
"upcoming_invoice": {
"amount_due": 64000,
"amount_paid": 0,
"amount_remaining": 64000,
"application_fee": null,
"attempt_count": 0,
"attempted": false,
"billing": "charge_automatically",
"billing_reason": "upcoming",
"charge": null,
"closed": false,
"currency": "usd",
"customer": "cus_D7OT2jf5YAtZQL",
"date": 1561526751,
"description": "",
"discount": null,
"due_date": null,
"ending_balance": null,
"forgiven": false,
"lines": {
"data": [
{
"amount": 64000,
"currency": "usd",
"description": "8 user \u00d7 Zulip Cloud Standard (at $80.00 / year)",
"discountable": true,
"id": "sub_D7OTT8FZbOPxah",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1593149151,
"start": 1561526751
},
"plan": {
"active": true,
"aggregate_usage": null,
"amount": 8000,
"billing_scheme": "per_unit",
"created": 1529987890,
"currency": "usd",
"id": "plan_Do3xCvbzO89OsR",
"interval": "year",
"interval_count": 1,
"livemode": false,
"metadata": {},
"nickname": "annual",
"object": "plan",
"product": "prod_D7NhmicJvX2edE",
"tiers": null,
"tiers_mode": null,
"transform_usage": null,
"trial_period_days": null,
"usage_type": "licensed"
},
"proration": false,
"quantity": 8,
"subscription": null,
"subscription_item": "si_D7OTEItF5ZLN2R",
"type": "subscription"
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/invoices/upcoming/lines?customer=cus_D7OT2jf5YAtZQL"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1561530351,
"number": "23ABC45-0002",
"object": "invoice",
"paid": false,
"period_end": 1561526751,
"period_start": 1529990751,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": null,
"subscription": "sub_D7OTT8FZbOPxah",
"subtotal": 64000,
"tax": 0,
"tax_percent": 0.0,
"total": 64000,
"webhooks_delivered_at": null
}
}
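A quick consistency check on the fixture above: Stripe amounts are integer cents, so the subscription's per-license plan amount of 8000 ($80.00/year) times the quantity of 8 should equal the upcoming invoice's 64000-cent totals.

# Sanity check of the create_subscription / upcoming_invoice fixture above.
price_per_license_cents = 8000  # the plan's "amount" field
licenses = 8                    # the subscription's "quantity" field
assert price_per_license_cents * licenses == 64000  # amount_due, subtotal, total
print(f"${price_per_license_cents * licenses / 100:.2f} per year")  # $640.00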


@@ -1,112 +0,0 @@
{
"amount": 7200,
"amount_refunded": 0,
"application": null,
"application_fee": null,
"application_fee_amount": null,
"balance_transaction": "txn_NORMALIZED00000000000001",
"billing_details": {
"address": {
"city": "Pacific",
"country": "United States",
"line1": "Under the sea,",
"line2": null,
"postal_code": "33333",
"state": null
},
"email": null,
"name": "Ada Starr",
"phone": null
},
"captured": true,
"created": 1000000000,
"currency": "usd",
"customer": "cus_NORMALIZED0001",
"description": "Upgrade to Zulip Standard, $12.0 x 6",
"destination": null,
"dispute": null,
"failure_code": null,
"failure_message": null,
"fraud_details": {},
"id": "ch_NORMALIZED00000000000001",
"invoice": null,
"livemode": false,
"metadata": {},
"object": "charge",
"on_behalf_of": null,
"order": null,
"outcome": {
"network_status": "approved_by_network",
"reason": null,
"risk_level": "normal",
"risk_score": 00,
"seller_message": "Payment complete.",
"type": "authorized"
},
"paid": true,
"payment_intent": null,
"payment_method": "card_NORMALIZED00000000000001",
"payment_method_details": {
"card": {
"brand": "visa",
"checks": {
"address_line1_check": "pass",
"address_postal_code_check": "pass",
"cvc_check": "pass"
},
"country": "US",
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"last4": "4242",
"three_d_secure": null,
"wallet": null
},
"type": "card"
},
"receipt_email": "hamlet@zulip.com",
"receipt_number": null,
"receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000001/rcpt_NORMALIZED000000000000000000001",
"refunded": false,
"refunds": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/charges/ch_NORMALIZED00000000000001/refunds"
},
"review": null,
"shipping": null,
"source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"source_transfer": null,
"statement_descriptor": "Zulip Standard",
"statement_descriptor_suffix": "Zulip Standard",
"status": "succeeded",
"transfer_data": null,
"transfer_group": null
}
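As a consistency check on the charge above (and on the similar "$60.0 x 6" charge in the next fixture): the description encodes a per-license price times a license count, and that product, converted to cents, matches the charge's amount field.

# "Upgrade to Zulip Standard, $12.0 x 6" -> 7200 cents; "$60.0 x 6" -> 36000 cents.
assert round(12.0 * 6 * 100) == 7200
assert round(60.0 * 6 * 100) == 36000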


@@ -1,112 +0,0 @@
{
"amount": 36000,
"amount_refunded": 0,
"application": null,
"application_fee": null,
"application_fee_amount": null,
"balance_transaction": "txn_NORMALIZED00000000000002",
"billing_details": {
"address": {
"city": "Pacific",
"country": "United States",
"line1": "Under the sea,",
"line2": null,
"postal_code": "33333",
"state": null
},
"email": null,
"name": "Ada Starr",
"phone": null
},
"captured": true,
"created": 1000000000,
"currency": "usd",
"customer": "cus_NORMALIZED0001",
"description": "Upgrade to Zulip Standard, $60.0 x 6",
"destination": null,
"dispute": null,
"failure_code": null,
"failure_message": null,
"fraud_details": {},
"id": "ch_NORMALIZED00000000000002",
"invoice": null,
"livemode": false,
"metadata": {},
"object": "charge",
"on_behalf_of": null,
"order": null,
"outcome": {
"network_status": "approved_by_network",
"reason": null,
"risk_level": "normal",
"risk_score": 00,
"seller_message": "Payment complete.",
"type": "authorized"
},
"paid": true,
"payment_intent": null,
"payment_method": "card_NORMALIZED00000000000002",
"payment_method_details": {
"card": {
"brand": "visa",
"checks": {
"address_line1_check": "pass",
"address_postal_code_check": "pass",
"cvc_check": "pass"
},
"country": "US",
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"last4": "4242",
"three_d_secure": null,
"wallet": null
},
"type": "card"
},
"receipt_email": "hamlet@zulip.com",
"receipt_number": null,
"receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000002/rcpt_NORMALIZED000000000000000000002",
"refunded": false,
"refunds": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/charges/ch_NORMALIZED00000000000002/refunds"
},
"review": null,
"shipping": null,
"source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000002",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"source_transfer": null,
"statement_descriptor": "Zulip Standard",
"statement_descriptor_suffix": "Zulip Standard",
"status": "succeeded",
"transfer_data": null,
"transfer_group": null
}


@@ -1,113 +0,0 @@
{
"data": [
{
"amount": 7200,
"amount_refunded": 0,
"application": null,
"application_fee": null,
"application_fee_amount": null,
"balance_transaction": "txn_NORMALIZED00000000000001",
"billing_details": {
"address": {
"city": "Pacific",
"country": "United States",
"line1": "Under the sea,",
"line2": null,
"postal_code": "33333",
"state": null
},
"email": null,
"name": "Ada Starr",
"phone": null
},
"captured": true,
"created": 1000000000,
"currency": "usd",
"customer": "cus_NORMALIZED0001",
"description": "Upgrade to Zulip Standard, $12.0 x 6",
"destination": null,
"dispute": null,
"failure_code": null,
"failure_message": null,
"fraud_details": {},
"id": "ch_NORMALIZED00000000000001",
"invoice": null,
"livemode": false,
"metadata": {},
"object": "charge",
"on_behalf_of": null,
"order": null,
"outcome": {
"network_status": "approved_by_network",
"reason": null,
"risk_level": "normal",
"risk_score": 00,
"seller_message": "Payment complete.",
"type": "authorized"
},
"paid": true,
"payment_intent": null,
"payment_method": "card_NORMALIZED00000000000001",
"payment_method_details": {
"card": {
"brand": "visa",
"checks": {
"address_line1_check": "pass",
"address_postal_code_check": "pass",
"cvc_check": "pass"
},
"country": "US",
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"last4": "4242",
"three_d_secure": null,
"wallet": null
},
"type": "card"
},
"receipt_email": "hamlet@zulip.com",
"receipt_number": null,
"receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000001/rcpt_NORMALIZED000000000000000000001",
"refunded": false,
"refunds": {},
"review": null,
"shipping": null,
"source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"source_transfer": null,
"statement_descriptor": "Zulip Standard",
"statement_descriptor_suffix": "Zulip Standard",
"status": "succeeded",
"transfer_data": null,
"transfer_group": null
}
],
"has_more": false,
"object": "list",
"url": "/v1/charges"
}


@@ -1,219 +0,0 @@
{
"data": [
{
"amount": 36000,
"amount_refunded": 0,
"application": null,
"application_fee": null,
"application_fee_amount": null,
"balance_transaction": "txn_NORMALIZED00000000000002",
"billing_details": {
"address": {
"city": "Pacific",
"country": "United States",
"line1": "Under the sea,",
"line2": null,
"postal_code": "33333",
"state": null
},
"email": null,
"name": "Ada Starr",
"phone": null
},
"captured": true,
"created": 1000000000,
"currency": "usd",
"customer": "cus_NORMALIZED0001",
"description": "Upgrade to Zulip Standard, $60.0 x 6",
"destination": null,
"dispute": null,
"failure_code": null,
"failure_message": null,
"fraud_details": {},
"id": "ch_NORMALIZED00000000000002",
"invoice": null,
"livemode": false,
"metadata": {},
"object": "charge",
"on_behalf_of": null,
"order": null,
"outcome": {
"network_status": "approved_by_network",
"reason": null,
"risk_level": "normal",
"risk_score": 00,
"seller_message": "Payment complete.",
"type": "authorized"
},
"paid": true,
"payment_intent": null,
"payment_method": "card_NORMALIZED00000000000002",
"payment_method_details": {
"card": {
"brand": "visa",
"checks": {
"address_line1_check": "pass",
"address_postal_code_check": "pass",
"cvc_check": "pass"
},
"country": "US",
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"last4": "4242",
"three_d_secure": null,
"wallet": null
},
"type": "card"
},
"receipt_email": "hamlet@zulip.com",
"receipt_number": null,
"receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000002/rcpt_NORMALIZED000000000000000000002",
"refunded": false,
"refunds": {},
"review": null,
"shipping": null,
"source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000002",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"source_transfer": null,
"statement_descriptor": "Zulip Standard",
"statement_descriptor_suffix": "Zulip Standard",
"status": "succeeded",
"transfer_data": null,
"transfer_group": null
},
{
"amount": 7200,
"amount_refunded": 0,
"application": null,
"application_fee": null,
"application_fee_amount": null,
"balance_transaction": "txn_NORMALIZED00000000000001",
"billing_details": {
"address": {
"city": "Pacific",
"country": "United States",
"line1": "Under the sea,",
"line2": null,
"postal_code": "33333",
"state": null
},
"email": null,
"name": "Ada Starr",
"phone": null
},
"captured": true,
"created": 1000000000,
"currency": "usd",
"customer": "cus_NORMALIZED0001",
"description": "Upgrade to Zulip Standard, $12.0 x 6",
"destination": null,
"dispute": null,
"failure_code": null,
"failure_message": null,
"fraud_details": {},
"id": "ch_NORMALIZED00000000000001",
"invoice": null,
"livemode": false,
"metadata": {},
"object": "charge",
"on_behalf_of": null,
"order": null,
"outcome": {
"network_status": "approved_by_network",
"reason": null,
"risk_level": "normal",
"risk_score": 00,
"seller_message": "Payment complete.",
"type": "authorized"
},
"paid": true,
"payment_intent": null,
"payment_method": "card_NORMALIZED00000000000001",
"payment_method_details": {
"card": {
"brand": "visa",
"checks": {
"address_line1_check": "pass",
"address_postal_code_check": "pass",
"cvc_check": "pass"
},
"country": "US",
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"last4": "4242",
"three_d_secure": null,
"wallet": null
},
"type": "card"
},
"receipt_email": "hamlet@zulip.com",
"receipt_number": null,
"receipt_url": "https://pay.stripe.com/receipts/acct_NORMALIZED000001/ch_NORMALIZED00000000000001/rcpt_NORMALIZED000000000000000000001",
"refunded": false,
"refunds": {},
"review": null,
"shipping": null,
"source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"source_transfer": null,
"statement_descriptor": "Zulip Standard",
"statement_descriptor_suffix": "Zulip Standard",
"status": "succeeded",
"transfer_data": null,
"transfer_group": null
}
],
"has_more": false,
"object": "list",
"url": "/v1/charges"
}


@@ -1,79 +0,0 @@
{
"account_balance": 0,
"address": null,
"balance": 0,
"created": 1000000000,
"currency": null,
"default_source": "card_NORMALIZED00000000000001",
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": null,
"email": "hamlet@zulip.com",
"id": "cus_NORMALIZED0001",
"invoice_prefix": "NORMA01",
"invoice_settings": {
"custom_fields": null,
"default_payment_method": null,
"footer": null
},
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"name": null,
"object": "customer",
"phone": null,
"preferred_locales": [],
"shipping": null,
"sources": {
"data": [
{
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/customers/cus_NORMALIZED0001/sources"
},
"subscriptions": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
},
"tax_exempt": "none",
"tax_ids": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
},
"tax_info": null,
"tax_info_verification": null
}


@@ -1,103 +0,0 @@
{
"account_balance": 0,
"address": null,
"balance": 0,
"created": 1000000000,
"currency": "usd",
"default_source": {
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
},
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": null,
"email": "hamlet@zulip.com",
"id": "cus_NORMALIZED0001",
"invoice_prefix": "NORMA01",
"invoice_settings": {
"custom_fields": null,
"default_payment_method": null,
"footer": null
},
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"name": null,
"object": "customer",
"phone": null,
"preferred_locales": [],
"shipping": null,
"sources": {
"data": [
{
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000001",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/customers/cus_NORMALIZED0001/sources"
},
"subscriptions": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
},
"tax_exempt": "none",
"tax_ids": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
},
"tax_info": null,
"tax_info_verification": null
}


@@ -1,79 +0,0 @@
{
"account_balance": 0,
"address": null,
"balance": 0,
"created": 1000000000,
"currency": "usd",
"default_source": "card_NORMALIZED00000000000002",
"delinquent": false,
"description": "zulip (Zulip Dev)",
"discount": null,
"email": "hamlet@zulip.com",
"id": "cus_NORMALIZED0001",
"invoice_prefix": "NORMA01",
"invoice_settings": {
"custom_fields": null,
"default_payment_method": null,
"footer": null
},
"livemode": false,
"metadata": {
"realm_id": "1",
"realm_str": "zulip"
},
"name": null,
"object": "customer",
"phone": null,
"preferred_locales": [],
"shipping": null,
"sources": {
"data": [
{
"address_city": "Pacific",
"address_country": "United States",
"address_line1": "Under the sea,",
"address_line1_check": "pass",
"address_line2": null,
"address_state": null,
"address_zip": "33333",
"address_zip_check": "pass",
"brand": "Visa",
"country": "US",
"customer": "cus_NORMALIZED0001",
"cvc_check": "pass",
"dynamic_last4": null,
"exp_month": 3,
"exp_year": 2033,
"fingerprint": "NORMALIZED000001",
"funding": "credit",
"id": "card_NORMALIZED00000000000002",
"last4": "4242",
"metadata": {},
"name": "Ada Starr",
"object": "card",
"tokenization_method": null
}
],
"has_more": false,
"object": "list",
"total_count": 1,
"url": "/v1/customers/cus_NORMALIZED0001/sources"
},
"subscriptions": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/subscriptions"
},
"tax_exempt": "none",
"tax_ids": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/customers/cus_NORMALIZED0001/tax_ids"
},
"tax_info": null,
"tax_info_verification": null
}


@@ -1,117 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 0,
"amount_paid": 0,
"amount_remaining": 0,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1000000000,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": null,
"ending_balance": null,
"footer": null,
"hosted_invoice_url": null,
"id": "in_NORMALIZED00000000000001",
"invoice_pdf": null,
"lines": {
"data": [
{
"amount": 7200,
"currency": "usd",
"description": "Zulip Standard",
"discountable": false,
"id": "ii_NORMALIZED00000000000001",
"invoice_item": "ii_NORMALIZED00000000000001",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 6,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
},
{
"amount": -7200,
"currency": "usd",
"description": "Payment (Card ending in 4242)",
"discountable": false,
"id": "ii_NORMALIZED00000000000002",
"invoice_item": "ii_NORMALIZED00000000000002",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 1,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
}
],
"has_more": false,
"object": "list",
"total_count": 2,
"url": "/v1/invoices/in_NORMALIZED00000000000001/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0001",
"object": "invoice",
"paid": false,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "draft",
"status_transitions": {
"finalized_at": null,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 0,
"tax": null,
"tax_percent": null,
"total": 0,
"total_tax_amounts": [],
"webhooks_delivered_at": null
}
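Note how the draft invoice above balances to zero: the 7200-cent "Zulip Standard" line item is offset by the -7200-cent "Payment (Card ending in 4242)" line, which is why amount_due, subtotal, and total are all 0.

# The two line items on the invoice fixture above cancel out.
line_item_amounts = [7200, -7200]
assert sum(line_item_amounts) == 0  # matches amount_due / subtotal / total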


@@ -1,117 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 0,
"amount_paid": 0,
"amount_remaining": 0,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": false,
"auto_advance": true,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1000000000,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": null,
"ending_balance": null,
"footer": null,
"hosted_invoice_url": null,
"id": "in_NORMALIZED00000000000002",
"invoice_pdf": null,
"lines": {
"data": [
{
"amount": 36000,
"currency": "usd",
"description": "Zulip Standard",
"discountable": false,
"id": "ii_NORMALIZED00000000000003",
"invoice_item": "ii_NORMALIZED00000000000003",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 6,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
},
{
"amount": -36000,
"currency": "usd",
"description": "Payment (Card ending in 4242)",
"discountable": false,
"id": "ii_NORMALIZED00000000000004",
"invoice_item": "ii_NORMALIZED00000000000004",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 1,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
}
],
"has_more": false,
"object": "list",
"total_count": 2,
"url": "/v1/invoices/in_NORMALIZED00000000000002/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": 1000000000,
"number": "NORMALI-0002",
"object": "invoice",
"paid": false,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "draft",
"status_transitions": {
"finalized_at": null,
"marked_uncollectible_at": null,
"paid_at": null,
"voided_at": null
},
"subscription": null,
"subtotal": 0,
"tax": null,
"tax_percent": null,
"total": 0,
"total_tax_amounts": [],
"webhooks_delivered_at": null
}


@@ -1,117 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 0,
"amount_paid": 0,
"amount_remaining": 0,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": true,
"auto_advance": false,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1000000000,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000001",
"id": "in_NORMALIZED00000000000001",
"invoice_pdf": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000001/pdf",
"lines": {
"data": [
{
"amount": 7200,
"currency": "usd",
"description": "Zulip Standard",
"discountable": false,
"id": "ii_NORMALIZED00000000000001",
"invoice_item": "ii_NORMALIZED00000000000001",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 6,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
},
{
"amount": -7200,
"currency": "usd",
"description": "Payment (Card ending in 4242)",
"discountable": false,
"id": "ii_NORMALIZED00000000000002",
"invoice_item": "ii_NORMALIZED00000000000002",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 1,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
}
],
"has_more": false,
"object": "list",
"total_count": 2,
"url": "/v1/invoices/in_NORMALIZED00000000000001/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": null,
"number": "NORMALI-0001",
"object": "invoice",
"paid": true,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "paid",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": 1000000000,
"voided_at": null
},
"subscription": null,
"subtotal": 0,
"tax": null,
"tax_percent": null,
"total": 0,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}


@@ -1,117 +0,0 @@
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 0,
"amount_paid": 0,
"amount_remaining": 0,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": true,
"auto_advance": false,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1000000000,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000002",
"id": "in_NORMALIZED00000000000002",
"invoice_pdf": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000002/pdf",
"lines": {
"data": [
{
"amount": 36000,
"currency": "usd",
"description": "Zulip Standard",
"discountable": false,
"id": "ii_NORMALIZED00000000000003",
"invoice_item": "ii_NORMALIZED00000000000003",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 6,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
},
{
"amount": -36000,
"currency": "usd",
"description": "Payment (Card ending in 4242)",
"discountable": false,
"id": "ii_NORMALIZED00000000000004",
"invoice_item": "ii_NORMALIZED00000000000004",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 1,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
}
],
"has_more": false,
"object": "list",
"total_count": 2,
"url": "/v1/invoices/in_NORMALIZED00000000000002/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": null,
"number": "NORMALI-0002",
"object": "invoice",
"paid": true,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "paid",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": 1000000000,
"voided_at": null
},
"subscription": null,
"subtotal": 0,
"tax": null,
"tax_percent": null,
"total": 0,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}


@@ -1,124 +0,0 @@
{
"data": [
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 0,
"amount_paid": 0,
"amount_remaining": 0,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": true,
"auto_advance": false,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1000000000,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000001",
"id": "in_NORMALIZED00000000000001",
"invoice_pdf": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000001/pdf",
"lines": {
"data": [
{
"amount": 7200,
"currency": "usd",
"description": "Zulip Standard",
"discountable": false,
"id": "ii_NORMALIZED00000000000001",
"invoice_item": "ii_NORMALIZED00000000000001",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 6,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
},
{
"amount": -7200,
"currency": "usd",
"description": "Payment (Card ending in 4242)",
"discountable": false,
"id": "ii_NORMALIZED00000000000002",
"invoice_item": "ii_NORMALIZED00000000000002",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 1,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
}
],
"has_more": false,
"object": "list",
"total_count": 2,
"url": "/v1/invoices/in_NORMALIZED00000000000001/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": null,
"number": "NORMALI-0001",
"object": "invoice",
"paid": true,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "paid",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": 1000000000,
"voided_at": null
},
"subscription": null,
"subtotal": 0,
"tax": null,
"tax_percent": null,
"total": 0,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}
],
"has_more": false,
"object": "list",
"url": "/v1/invoices"
}


@@ -1,241 +0,0 @@
{
"data": [
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 0,
"amount_paid": 0,
"amount_remaining": 0,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": true,
"auto_advance": false,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1000000000,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000002",
"id": "in_NORMALIZED00000000000002",
"invoice_pdf": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000002/pdf",
"lines": {
"data": [
{
"amount": 36000,
"currency": "usd",
"description": "Zulip Standard",
"discountable": false,
"id": "ii_NORMALIZED00000000000003",
"invoice_item": "ii_NORMALIZED00000000000003",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 6,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
},
{
"amount": -36000,
"currency": "usd",
"description": "Payment (Card ending in 4242)",
"discountable": false,
"id": "ii_NORMALIZED00000000000004",
"invoice_item": "ii_NORMALIZED00000000000004",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 1,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
}
],
"has_more": false,
"object": "list",
"total_count": 2,
"url": "/v1/invoices/in_NORMALIZED00000000000002/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": null,
"number": "NORMALI-0002",
"object": "invoice",
"paid": true,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "paid",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": 1000000000,
"voided_at": null
},
"subscription": null,
"subtotal": 0,
"tax": null,
"tax_percent": null,
"total": 0,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
},
{
"account_country": "US",
"account_name": "Dev account",
"amount_due": 0,
"amount_paid": 0,
"amount_remaining": 0,
"application_fee_amount": null,
"attempt_count": 0,
"attempted": true,
"auto_advance": false,
"billing": "charge_automatically",
"billing_reason": "manual",
"charge": null,
"collection_method": "charge_automatically",
"created": 1000000000,
"currency": "usd",
"custom_fields": null,
"customer": "cus_NORMALIZED0001",
"customer_address": null,
"customer_email": "hamlet@zulip.com",
"customer_name": null,
"customer_phone": null,
"customer_shipping": null,
"customer_tax_exempt": "none",
"customer_tax_ids": [],
"default_payment_method": null,
"default_source": null,
"default_tax_rates": [],
"description": "",
"discount": null,
"due_date": 1000000000,
"ending_balance": 0,
"footer": null,
"hosted_invoice_url": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000001",
"id": "in_NORMALIZED00000000000001",
"invoice_pdf": "https://pay.stripe.com/invoice/invst_NORMALIZED0000000000000001/pdf",
"lines": {
"data": [
{
"amount": 7200,
"currency": "usd",
"description": "Zulip Standard",
"discountable": false,
"id": "ii_NORMALIZED00000000000001",
"invoice_item": "ii_NORMALIZED00000000000001",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 6,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
},
{
"amount": -7200,
"currency": "usd",
"description": "Payment (Card ending in 4242)",
"discountable": false,
"id": "ii_NORMALIZED00000000000002",
"invoice_item": "ii_NORMALIZED00000000000002",
"livemode": false,
"metadata": {},
"object": "line_item",
"period": {
"end": 1000000000,
"start": 1000000000
},
"plan": null,
"proration": false,
"quantity": 1,
"subscription": null,
"tax_amounts": [],
"tax_rates": [],
"type": "invoiceitem"
}
],
"has_more": false,
"object": "list",
"total_count": 2,
"url": "/v1/invoices/in_NORMALIZED00000000000001/lines"
},
"livemode": false,
"metadata": {},
"next_payment_attempt": null,
"number": "NORMALI-0001",
"object": "invoice",
"paid": true,
"payment_intent": null,
"period_end": 1000000000,
"period_start": 1000000000,
"post_payment_credit_notes_amount": 0,
"pre_payment_credit_notes_amount": 0,
"receipt_number": null,
"starting_balance": 0,
"statement_descriptor": "Zulip Standard",
"status": "paid",
"status_transitions": {
"finalized_at": 1000000000,
"marked_uncollectible_at": null,
"paid_at": 1000000000,
"voided_at": null
},
"subscription": null,
"subtotal": 0,
"tax": null,
"tax_percent": null,
"total": 0,
"total_tax_amounts": [],
"webhooks_delivered_at": 1000000000
}
],
"has_more": false,
"object": "list",
"url": "/v1/invoices"
}
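Fixtures like the ones above are ordinary captured Stripe API responses, so they can be loaded and inspected with the standard library. The path below is hypothetical and purely for illustration; substitute wherever the fixture files actually live.

import json

# Hypothetical path, for illustration only.
with open("stripe_fixtures/invoice_list.json") as f:
    invoices = json.load(f)

for invoice in invoices["data"]:
    print(invoice["number"], invoice["status"], invoice["amount_due"])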

Some files were not shown because too many files have changed in this diff.